From hansa at mincom.com Thu Oct 1 00:07:56 1998 From: hansa at mincom.com (Hans Andersen) Date: Thu, 01 Oct 1998 14:07:56 +1000 Subject: Ph.D. Thesis on Neural/Fuzzy Control available. Message-ID: <3613001C.E1D7C8F5@mincom.com> Hi, My recently awarded Ph.D. thesis is now available from the following web-page: http://www.elec.uq.edu.au/~annis/papers/HansThesis/theCOEM.html The abstract and other details are included at the bottom of this message. Regards, Hans Christian Andersen, | E-mail: | hansa at mincom.com | | Department of Ph.D. work: | Department of Computer Science and Electrical Engineering, | University of Queensland, | St Lucia, Brisbane, Qld, 4072, | Australia. ---------------------------------------------------------------------- The Controller Output Error Method ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Ph.D. Thesis by: Hans Christian Asminn Andersen Supervised by: Dr Louis Westphal in the field of: Electrical Engineering at the: Department of Computer Science and Electrical Engineering, University of Queensland, Brisbane, Australia. Abstract: This thesis proposes the Controller Output Error Method (COEM) for adaptation of neural and fuzzy controllers. Most existing methods of neural adaptive control employ some kind of plant model which is used to infer the error of the control signal from the error at the plant output. The error of the control signal is used to adjust the controller parameters such that some cost function is optimized. Schemes of this kind are generally described as being indirect. Unlike these, COEM is direct since it does not require a plant model in order to calculate the error of the control signal. Instead it calculates the control signal error by performing input matching. This entails generating two control signals; the first control signal is applied to the plant and the second is inferred from the plant's response to the first control signal. 
The controller output error is the difference between these two control signals and is used by the COEM to adapt the controller. The method is shown to be a viable strategy for adaptation of controllers based on nonlinear function approximation. This is done by means of mathematical analysis and simulation experiments. It is proven that, provided a given controller is sufficiently close to optimal at the commencement of COEM-adaptation, its parameters will converge, and the control signal and the output of the plant being controlled will be both bounded and convergent. Experiments demonstrate that the method yields performance which is comparable to or superior to that yielded by other neural and linear adaptive control paradigms.

In addition to these results, this thesis shows the following:

* The convergence time of the COEM may be greatly reduced by performing more than one adaptation during each sampling period.
* It is possible to filter a reference signal in order to help ensure that reachable targets are set for the plant.
* An adaptive fuzzy system may be prevented from corrupting the intuitive interpretation upon which it was originally designed.
* Controllers adapted by COEM will perform best if a suitable sampling rate is selected.
* The COEM may be expected to work as well on fuzzy controllers as it does on neural controllers.

Furthermore, the extent of the functional equivalence between certain types of neural networks and fuzzy inference systems is clarified, and a new approach to the matrix formulation of a range of fuzzy inference systems is proposed.
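[Editor's illustration] To make the input-matching idea above concrete, here is a minimal sketch of COEM-style adaptation, with a linear controller and a first-order linear plant standing in for the neural/fuzzy controllers and unknown plants treated in the thesis. The plant coefficients, learning rate, exploration noise and iteration count are all invented for the example:

```python
import random

random.seed(0)

# Unknown plant (used only to generate responses, never in the update):
# y' = 0.8*y + 0.5*u.  Coefficients are invented for the example.
def plant(y, u):
    return 0.8 * y + 0.5 * u

# Linear controller u = w[0]*y + w[1]*target, standing in for the
# neural/fuzzy function approximators treated in the thesis.
w = [0.0, 0.0]

def control(y, target):
    return w[0] * y + w[1] * target

lr = 0.02
y = 0.0
for _ in range(20000):
    r = random.uniform(-1.0, 1.0)                    # reference (desired next output)
    u1 = control(y, r) + random.uniform(-0.2, 0.2)   # applied control + exploration
    y_next = plant(y, u1)                            # plant's response
    u2 = control(y, y_next)                          # second signal, via input matching
    e = u1 - u2                                      # the controller output error
    # Gradient step: teach the controller to output u1 when the target is y_next
    w[0] += lr * e * y
    w[1] += lr * e * y_next
    y = y_next

# For this plant the ideal inverse controller is u = -1.6*y + 2.0*target
print([round(v, 2) for v in w])
```

Note that no plant model is fitted or inverted anywhere in the update, which is the sense in which the method is direct; the plant function above exists only to simulate responses.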
From kehagias at egnatia.ee.auth.gr Thu Oct 1 14:10:05 1998 From: kehagias at egnatia.ee.auth.gr (Thanasis Kehagias) Date: Thu, 01 Oct 1998 11:10:05 -0700 Subject: THE DATA ALLOCATION PROBLEM Message-ID: <3.0.5.32.19981001111005.007a08e0@egnatia.ee.auth.gr> THE DATA ALLOCATION PROBLEM I have recently become interested in the following problem and would be grateful for any feedback you can give me (mailto:kehagias at egnatia.ee.auth.gr).

The Setup: Consider a collection of data: y(1), y(2), y(3), ..., generated by more than one source. At time t one of the sources is activated (perhaps randomly) and generates the datum y(t). We want to identify the number of active sources and extract some information regarding each source (e.g. an input/output model, or some statistics such as mean value, standard deviation etc. of the source's output). No a priori information is available regarding the number, behavior etc. of the sources. In particular, the observed data are unlabelled, i.e. it is not known which source is active at time t.

The Online Data Allocation Task: It seems to me that in such a situation the major task is data allocation. I mean this: if the observed data were partitioned into groups, each group containing data generated by a single source, then each data group could be used to train a model for the respective source. Generally speaking, training on clean data groups should not be too hard. However, since the data are not labelled, it is not immediately clear how to allocate them between groups. As I will explain a little later, the problem seems harder for the online case (with a continuously incoming stream of data) than for the offline case, where a finite data set is involved.

The Convergence Question: Special cases of the above problem and various solutions have appeared in the literature. I am interested in obtaining quite general sufficient (and necessary?) conditions for an online data allocation process to converge to a correct solution.
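[Editor's illustration] For concreteness, here is a minimal sketch of one simple scheme from the family the posting describes: winner-take-all credit assignment, with each per-source "model" reduced to a running mean. The two Gaussian sources and all parameters are invented for illustration:

```python
import random

random.seed(1)

# Two hidden sources; at each time t one is activated at random.
sources = [lambda: random.gauss(0.0, 0.3), lambda: random.gauss(5.0, 0.3)]

# K competing "models" -- here just running means with sample counts.
K = 2
means = [random.uniform(0, 5) for _ in range(K)]
counts = [1] * K

for _ in range(3000):
    y = random.choice(sources)()                       # unlabelled datum y(t)
    # Allocation step: credit the datum to the model that fits it best.
    k = min(range(K), key=lambda i: abs(y - means[i]))
    counts[k] += 1
    means[k] += (y - means[k]) / counts[k]             # update only the winner

print(sorted(round(m, 1) for m in means))
```

If each running mean is replaced by a trainable predictor and the absolute deviation by a prediction error, the same loop becomes the online allocate-then-train procedure discussed above; the convergence question is precisely when such loops settle into a correct partition.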
By "correct solution", I mean a partition of the observed data into groups such that every group contains predominantly data from one source and every source corresponds to only one data group. The convergence conditions should be fairly general, so as to allow a unified treatment of many different data allocation algorithms and different kinds of sources (and data). We have obtained some results, which appear in our recent book (announced in a separate posting) and in a series of papers (also announced in a separate posting). I summarize our results on my web site at http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm At this point I am interested in getting some feedback regarding possible approaches to the problem, relevant bibliographic pointers and so on. I already have a modestly sized bibliography on this. I will summarize all responses and post the summary.

___________________________________________________________________
Ath. Kehagias --Assistant Prof. of Mathematics, American College of Thessaloniki --Research Ass., Dept. of Electrical and Computer Eng. Aristotle Univ., Thessaloniki, GR54006, GREECE --email: kehagias at egnatia.ee.auth.gr, kehagias at ac.anatolia.edu.gr --web: http://skiron.control.ee.auth.gr/~kehagias/index.htm

From Frederic.Alexandre at loria.fr Thu Oct 1 05:01:19 1998 From: Frederic.Alexandre at loria.fr (Frederic Alexandre) Date: Thu, 1 Oct 1998 11:01:19 +0200 (MET DST) Subject: Postdoctoral positions Message-ID: <199810010901.LAA13409@wernicke.loria.fr> LORIA/INRIA Lorraine computer science laboratory in Nancy, France Two postdoctoral positions are available for developing computational neuroscience models in Nancy, France, from January to July 1999. Our team: the CORTEX team is developing connectionist models, inspired by biology, for perception, reasoning and autonomous behavior. As part of a computer science lab, we mainly aim to propose effective models for robotics, speech and image processing.
The two positions: they will fit into our two current projects: first, a model of hippocampus-cortex connections for the internal representation of the environment; second, a model of the prefrontal cortex for the intelligent exploitation of this representation (strategy, planning). The candidates: they should be well trained in connectionist modeling, primarily from a computer science point of view. Experience in robotics, autonomous behavior and neuroscience would be highly appreciated. For further information, contact: ---------------------------------------------------------------------------- Frederic ALEXANDRE Tel: (+33/0) 3 83 59 20 53 INRIA-Lorraine/LORIA-CNRS Fax: (+33/0) 3 83 41 30 79 BP 239 E-mail: falex at loria.fr 54506 Vandoeuvre-les-Nancy Cedex http://www.loria.fr/~falex FRANCE ---------------------------------------------------------------------------- From simon.schultz at psy.ox.ac.uk Thu Oct 1 05:56:51 1998 From: simon.schultz at psy.ox.ac.uk (Simon Schultz) Date: Thu, 01 Oct 1998 10:56:51 +0100 Subject: Thesis available Message-ID: <361351E3.2F1C@psy.ox.ac.uk> The following D.Phil. thesis is now available for downloading: ----------------------------------------------------------------- Information encoding in the mammalian cerebral cortex Simon R. Schultz Corpus Christi College, Oxford Short abstract: This thesis describes new techniques for studying information encoding and transmission in the mammalian nervous system. The underlying theme of the thesis is the use of information theory to quantitatively study real and model neuronal systems. The thesis begins with an analytical calculation of the information that can be conveyed by a feedforward network of threshold-linear neurons. The replica-symmetric solution for the mutual information is found to be valid at all but low noise values.
This method is then used to make a quantitative calculation of the information that can be conveyed by the Schaffer collaterals, which project from hippocampal subregion CA3 to subregion CA1. The effects on information transmission of a number of details of the anatomy of the projection are explored, including convergence, divergence and topography of connectivity. Information theory is then applied to the analysis of data from neurophysiological recordings, by quantifying the information encoded in the responses (action potentials) of neural ensembles about environmental correlates. The decoding approach to estimating the information contained in the responses of populations of cells is examined in the limit of short time windows. It is shown that in this physiologically pertinent limit, decoding algorithms which estimate the full probability distribution must fail, whereas maximum likelihood algorithms remain accurate. The metric content, or amount of structure in the neuronal activity, is found to have a residual component at short time windows which is related to the instantaneous information transmission rate. The equation for mutual information is then approximated by a series expansion to second order, and it is found that while as has been previously noted, the first order terms depend only on the firing rates, the second order terms break down into rate and correlational components of the information. This leads to a new procedure for quantifying the relative contributions of correlations (such as synchronisation) and firing rates to neural information encoding. The practicality of this procedure is demonstrated by applying it to data recorded from the primate medial and inferior temporal lobes. 
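[Editor's illustration] As a small illustration of the information-theoretic machinery used throughout the thesis, here is the plug-in estimate of the mutual information I(S;R) between discrete stimuli and responses, computed directly from joint counts. The function name and toy data are invented, and the bias corrections needed for limited samples of real recordings are deliberately omitted:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(S;R) in bits from (stimulus, response) samples."""
    n = len(pairs)
    joint = Counter(pairs)                     # counts of (s, r) pairs
    cs = Counter(s for s, _ in pairs)          # stimulus counts
    cr = Counter(r for _, r in pairs)          # response counts
    return sum((c / n) * math.log2(c * n / (cs[s] * cr[r]))
               for (s, r), c in joint.items())

# Toy check: a perfectly informative binary code carries 1 bit...
print(mutual_information([(0, 0), (1, 1)] * 50))                  # -> 1.0
# ...and an independent pairing carries none.
print(mutual_information([(0, 0), (0, 1), (1, 0), (1, 1)] * 25))  # -> 0.0
```

On real data this naive estimator suffers from limited-sampling bias, which is one reason the short-time-window limits and series-expansion approaches described above are attractive.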
----------------------------------------------------------------- The thesis is available either as a gzipped postscript file: http://www.mrc-bbc.ox.ac.uk/~schultz/thesis.ps.gz or as individual chapters: http://www.mrc-bbc.ox.ac.uk/~schultz/theschaps.html -- ----------------------------------------------------------------------- Simon Schultz Department of Experimental Psychology also: University of Oxford Corpus Christi College South Parks Rd., Oxford OX1 3UD Oxford OX1 4JF Phone: +44-1865-271419 Fax: +44-1865-310447 http://www.mrc-bbc.ox.ac.uk/~schultz/ ----------------------------------------------------------------------- From oby at cs.tu-berlin.de Fri Oct 2 09:55:19 1998 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Fri, 2 Oct 1998 15:55:19 +0200 (MET DST) Subject: preprints available Message-ID: <199810021355.PAA02999@pollux.cs.tu-berlin.de> Dear connectionists, attached please find abstracts and preprint locations of five manuscripts on: optical recording of brain activity: 1. tissue optics simulations for depth-resolved optical recording 2. ICA analysis of optical recording data biological modelling: 3. contrast adaptation, fast synaptic depression, and Infomax in visual cortical neurons 4. the role of non-linear lateral interactions in cortical map formation ANN theory: 5. optimal hyperplane classifiers for pseudo-Euclidean and pairwise data Comments are welcome! Cheers Klaus ----------------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ ============================================================================= Simulation of Scanning Laser Techniques for Optical Imaging of Blood-Related Intrinsic Signals M. Stetter and K. 
Obermayer Fachbereich Informatik, Technische Universitaet Berlin Optical Imaging of intrinsic signals detects neural activation patterns by taking video images of the local activity-related changes in the light intensity reflected from neural tissue (intrinsic signals). At red light (605 nm), these signals are mainly caused by local variations of the tissue absorption following deoxygenation of blood. In this work, we characterize the image generation process during Optical Imaging by Monte Carlo simulations of light propagation through a homogeneous model tissue equipped with a local absorber. Conventional video-imaging and Scanning Laser imaging are compared to each other. We find that, compared to video imaging, Scanning Laser techniques drastically increase both the contrast and the lateral resolution of optical recordings. Also, the maximum depth up to which the signals can be detected is increased by roughly a factor of 2 using Scanning Laser Optical Imaging. Further, the radial profile of the diffuse reflectance pattern for each pixel is subject to changes which correlate with the depth of the absorber within the tissue. We suggest a detection geometry for the online measurement of these radial profiles, which can be realized by modifying a standard Scanning Laser Ophthalmoscope. in: Journal of the Optical Society of America A, in press available at: http://ni.cs.tu-berlin.de/publications/#journals ----------------------------------------------------------------------------- Blind separation of spatial signal patterns from optical imaging records. I. Schiessl^1, M. Stetter^1, J. Mayhew^2, S. Askew^2, N. McLoughlin^3, J. Levitt^4, J. Lund^4, and K. Obermayer^5.
1 Fachbereich Informatik, Technische Universitaet Berlin 2 AIVRU, University of Sheffield 3 Department of Neurobiology, Harvard Medical School 4 Institute of Ophthalmology, UCL Optical imaging of intrinsic signals measures two-dimensional neuronal activity patterns by detecting small activity-related changes in the light reflectance of neural tissue. We test to what extent blind source separation methods, which are based on the spatial independence of different signal components, are suitable for the separation of these neural-activity-related signal components from nonspecific background variations of the light reflectance. Two ICA algorithms (Infomax and kurtosis optimization) and blind source separation by extended spatial decorrelation are compared to each other with respect to their robustness against sensor noise, and are applied to optical recordings from macaque primary visual cortex. We find that extended spatial decorrelation is superior to both the ICA algorithms and standard methods, because it explicitly takes advantage of the spatial smoothness of the intrinsic signal components. in: Proceedings of the ICA '99 conference, 1999 (accepted) available at: http://ni.cs.tu-berlin.de/publications/#conference ----------------------------------------------------------------------------- Influence of changing the synaptic transmitter release probability on contrast adaptation of simple cells in the primary visual cortex. P. Adorjan and K. Obermayer. Fachbereich Informatik, Technische Universitaet Berlin The contrast response function (CRF) of many neurons in the primary visual cortex saturates, and shifts towards higher contrast values following prolonged presentation of high contrast visual stimuli. Using a recurrent neural network of excitatory spiking neurons with adapting synapses we show that both effects could be explained by a fast and a slow component in the synaptic adaptation.
The fast component - a short term synaptic depression component - leads to a saturation of the CRF and a phase advance in the cortical cells' response to high contrast stimuli. The slow component is derived from an adaptation of the probability of the synaptic transmitter release, and changes such that the mutual information between the input and the output of a cortical neuron is maximal. This component - given by the infomax learning rule - explains contrast adaptation of the averaged membrane potential (DC component) as well as the surprising experimental result that the stimulus-modulated component (F1 component) of a cortical cell's membrane potential adapts only weakly. Based on our results we propose a new experimental method to estimate the strength of the effective excitatory feedback to a cortical neuron, and we also suggest a relatively simple experimental test to justify our hypothesized synaptic mechanism for contrast adaptation. in: Advances in Neural Information Processing Systems NIPS 11, 1999 (accepted). available at: http://ni.cs.tu-berlin.de/publications/#conference ----------------------------------------------------------------------------- The role of lateral cortical competition in ocular dominance development. C. Piepenbrock and K. Obermayer. Fachbereich Informatik, Technische Universitaet Berlin Lateral competition within a layer of neurons sharpens and localizes the response to an input stimulus. Here, we investigate a model for the activity-dependent development of ocular dominance maps which allows the degree of lateral competition to be varied. For weak competition, it resembles a correlation-based learning model and for strong competition, it becomes a self-organizing map. Thus, in the regime of weak competition the receptive fields are shaped by the second order statistics of the input patterns, whereas in the regime of strong competition, the higher moments and ``features'' of the individual patterns become important.
When correlated localized stimuli from two eyes drive the cortical development we find (i) that a topographic map and binocular, localized receptive fields emerge when the degree of competition exceeds a critical value and (ii) that receptive fields exhibit eye dominance beyond a second critical value. For anti-correlated activity between the eyes, the second order statistics drive the system to develop ocular dominance even for weak competition, but no topography emerges. Topography is established only beyond a critical degree of competition. in: Advances in Neural Information Processing Systems NIPS 11, 1999 (accepted). available at: http://ni.cs.tu-berlin.de/publications/#conference ----------------------------------------------------------------------------- Classification on pairwise proximity data. T. Graepel, R. Herbrich, P. Bollmann-Sdorra, and K. Obermayer. Fachbereich Informatik, Technische Universitaet Berlin We investigate the problem of learning a classification task on data represented in terms of their pairwise proximities. This representation does not refer to an explicit feature representation of the data items and is thus more general than the standard approach of using Euclidean feature vectors, from which pairwise proximities can always be calculated. Our first approach is based on a combined linear embedding and classification procedure resulting in an extension of the Optimal Hyperplane algorithm to pseudo-Euclidean data. As an alternative we present another approach based on a linear threshold model in the proximity values themselves, which is optimized using Structural Risk Minimization. We show that prior knowledge about the problem can be incorporated by the choice of distance measures and examine different metrics w.r.t. their generalization. Finally, the algorithms are successfully applied to protein structure data and to data from the cat's cerebral cortex. They show better performance than K-nearest-neighbor classification. 
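[Editor's illustration] The K-nearest-neighbor baseline mentioned at the end of the abstract also operates on pairwise proximities alone, which makes it a natural reference point. A minimal sketch, with the function name and toy data invented for illustration:

```python
def knn_predict(dists, labels, k=3):
    """Classify a test item from its dissimilarities to labelled training items.

    dists[i] is the pairwise proximity (dissimilarity) between the test item
    and training item i -- no feature vectors are required.
    """
    nearest = sorted(range(len(dists)), key=lambda i: dists[i])[:k]
    votes = {}
    for i in nearest:
        votes[labels[i]] = votes.get(labels[i], 0) + 1
    return max(votes, key=votes.get)

labels = ["a", "a", "b", "b"]
print(knn_predict([0.1, 0.2, 0.9, 0.8], labels))  # -> a
print(knn_predict([0.9, 0.8, 0.2, 0.1], labels))  # -> b
```

The approaches in the paper go further, using the proximities to define either a (pseudo-Euclidean) embedding or a linear threshold model, but they are evaluated against exactly this kind of proximity-only baseline.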
in: Advances in Neural Information Processing Systems NIPS 11, 1999 (accepted). available at: http://ni.cs.tu-berlin.de/publications/#conference From harnad at coglit.soton.ac.uk Fri Oct 2 12:33:34 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Fri, 2 Oct 1998 17:33:34 +0100 (BST) Subject: Social Cognitive Bias: PSYCOLOQUY Call for Commentary Message-ID: Krueger: Social Cognitive Bias The target article whose abstract appears below has just appeared in PSYCOLOQUY, a refereed journal of Open Peer Commentary sponsored by the American Psychological Association. Qualified professional biobehavioral, neural or cognitive scientists are hereby invited to submit Open Peer Commentary on it. Please email for Instructions if you are not familiar with format or acceptance criteria for PSYCOLOQUY commentaries (all submissions are refereed). To submit articles and commentaries or to seek information: EMAIL: psyc at pucc.princeton.edu URL: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc To retrieve the article: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.46 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.46.social-bias.1.krueger AUTHOR'S RATIONALE FOR SOLICITING COMMENTARY: My contention is that social psychological research has depicted social perception in an excessively negative light by relying too much on demonstrations of various irrational biases. Normative models of good judgment have been too restrictive, and the prevalent testing strategy has equated good judgment with the truth of a null hypothesis. Rejections of such null hypotheses have then been interpreted as evidence for bias. I am particularly interested in learning how psychologists and methodologists respond to the idea that the use of multiple theories and methods will improve our understanding of social perception.
I realize that my proposal is incomplete because breaking the predominance of the single ruling inference strategy (Null Hypothesis Significance Testing) may make it harder to draw comparisons between studies. How can the field preserve its coherence, while abandoning its traditional ways? ----------------------------------------------------------------------- psycoloquy.98.9.46.social-bias.1.krueger Fri Oct 2 1998 ISSN 1055-0143 (21 paragraphs, 41 references, 4 notes, 647 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 1998 Joachim Krueger THE BET ON BIAS: A FOREGONE CONCLUSION? Joachim Krueger Department of Psychology Brown University, Box 1853 Providence, RI 02912 USA Joachim_Krueger at Brown.edu http://www.brown.edu/Departments/Psychology/faculty/krueger.html ABSTRACT: Social psychology has painted a picture of human misbehavior and irrational thinking. For example, prominent social cognitive biases are said to distort consensus estimation, self perception, and causal attribution. The thesis of this target article is that the roots of this negativistic paradigm lie in the joint application of narrow normative theories and statistical testing methods designed to reject those theories. Suggestions for balancing the prevalent paradigm include (a) modifications to the ruling rituals of Null Hypothesis Significance Testing, (b) revisions of what is considered a normative response, and (c) increased emphasis on individual differences in judgment. 
KEYWORDS: Bayes' rule, bias, hypothesis testing, individual differences, probability, rationality, significance testing, social cognition, statistical inference To retrieve the article: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.46 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.46.social-bias.1.krueger From sporns at nsi.edu Fri Oct 2 15:37:36 1998 From: sporns at nsi.edu (Olaf Sporns) Date: Fri, 02 Oct 1998 19:37:36 +0000 Subject: job applications invited Message-ID: <36152B7F.6158FBDB@nsi.edu> Job applications are invited: POSTDOCTORAL FELLOWS W.M. KECK MACHINE PSYCHOLOGY LABORATORY The Neurosciences Institute, located in San Diego, California, invites applications for POSTDOCTORAL FELLOWSHIPS to study biologically based models of behaving real world devices (robots) as part of the newly established W.M. Keck Machine Psychology Laboratory. Continuing previous research conducted at the Institute, this Laboratory will be focusing on the construction of autonomous robots, the design of simulated models of large-scale neuronal networks that are capable of guiding behavior in the real world, and on developing methods for the simultaneous analysis of neural and behavioral states. Applicants should have a background in computational neuroscience, robotics, computer science, behavioral or cognitive science. Fellows will receive stipends appropriate to their qualifications and experience. Submit a curriculum vitae, statement of research interests, and names of three referees to: Dr. Olaf Sporns The Neurosciences Institute 10640 John Jay Hopkins Drive San Diego, California 92121 Fax: 619-626-2099 email: sporns at nsi.edu URL: http://www.nsi.edu. URL: http://www.nsi.edu/users/sporns/. Key reference: Almassy, N., G.M. Edelman, O. Sporns (1998) Behavioral constraints in the development of neuronal properties: A cortical model embedded in a real-world device. Cerebral Cortex 8:346-361.
From ingber at ingber.com Sat Oct 3 09:37:05 1998 From: ingber at ingber.com (Lester Ingber) Date: Sat, 3 Oct 1998 08:37:05 -0500 Subject: Open R&D Positions in Computational Finance/Chicago Message-ID: <19981003083705.A7988@ingber.com> * Open R&D Positions in Computational Finance/Chicago DRW Investments, LLC, a proprietary trading firm based at the Chicago Mercantile Exchange, with a branch office in London, is expanding its research department to support trading. * Programmer/Analyst -- Full Time At least 1-2 years experience programming in C or C++. Must have excellent background in Math, Physics, or similar disciplines. Flexible hours in intense environment. Requires strong commitment to several ongoing projects with shifting priorities. See http://www.ingber.com/ for some papers and code used on current projects. Please email Lester Ingber with a resume or any questions regarding these positions. * Graduate Student(s) -- Part Time (1 or 2) We would like to sponsor theses in computational finance that might impact our trading practices. Would require at least weekly face-to-face contact with DRW personnel. See http://www.ingber.com/ for some papers on current projects. Please email Lester Ingber with a resume or any questions regarding these positions. * Systems Administrator -- Full Time The primary role is to oversee and manage a heterogeneous network. Essential skills required are Windows NT, any flavor of Unix, TCP/IP networking and some programming skills in any language. 1-2 years experience required. Please email Man Wei Tam with a resume or any questions regarding the position. -- /* Lester Ingber Lester Ingber Research * * PO Box 06440 Wacker Dr PO Sears Tower Chicago, IL 60606-0440 * * http://www.ingber.com/ ingber at ingber.com ingber at alumni.caltech.edu */ From bernabe at cnmx4-fddi0.imse.cnm.es Mon Oct 5 04:46:54 1998 From: bernabe at cnmx4-fddi0.imse.cnm.es (Bernabe Linares B.) 
Date: Mon, 5 Oct 1998 10:46:54 +0200 Subject: No subject Message-ID: <199810050846.KAA11836@cnm12.cnm.es> A non-text attachment was scrubbed... Name: not available Type: text Size: 529 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/970595a7/attachment.ksh From becker at curie.psychology.mcmaster.ca Mon Oct 5 21:38:24 1998 From: becker at curie.psychology.mcmaster.ca (Sue Becker) Date: Mon, 5 Oct 1998 21:38:24 -0400 (EDT) Subject: faculty positions Message-ID: Dear connectionists, There are a number of open faculty positions at McMaster University which may interest you or your colleagues. Please feel free to pass this message on or post to other sites. Position in Psychology: Appended to the bottom of this message is an advertisement for a faculty position in the Department of Psychology at McMaster University, targeting Cognitive Psychology or Animal Behaviour. Computational modellers whose research intersects with the study of brain and behaviour are encouraged to apply. Please drop me a note if you plan on applying. Positions in Computer Science and Software Engineering: The Department of Computing and Software at McMaster also has a number of open positions. Several areas are being targeted but all outstanding applicants will be considered. See http://www.cas.mcmaster.ca This department has recently moved into the Faculty of Engineering and is expanding rapidly. Positions in Electrical Engineering: The Department of Electrical and Computer Engineering is also expanding rapidly, with information technology having been designated a priority area at McMaster. This department is presently targeting the areas of signal processing, computer networking and VLSI. For further details, see http://ece.mcmaster.ca/job_ft2.htm McMaster University is the first, and to our knowledge, the only university to offer an Honours Degree in Neural Computation. 
It is by many standards Canada's most research-intensive university - financial support from governments, foundations and business for our research projects is, on a per capita basis, the highest in the country. We have been recognized for several years as Canada's most innovative university. McMaster's Psychology Department is considered by many to be the best in Canada - for example, it ranks highest in the country in terms of publications per faculty member. Sincerely, Sue Becker, Associate Professor Dept. of Psychology -------------------------------------------------------------------------- FACULTY POSITION IN COGNITIVE PSYCHOLOGY/ANIMAL BEHAVIOUR The Department of Psychology at McMaster University invites applications from candidates eligible to be sponsored for a Natural Sciences and Engineering Research Council University Faculty Award (UFA), an award that is directed toward women. We plan to sponsor an application for a UFA, and will offer the successful applicant a tenure track position at the level of Assistant Professor. The duration of the UFA, 3 years initially with a possibility of renewal for a further 2 years, will count toward the normal probationary period for tenure. We are seeking candidates with an active research program in either animal behaviour or cognitive psychology. Preference for the cognition opening is for someone with an interest in memory or decision making, and whose research program extends to neuroscience/neuropsychology domains. Applications are encouraged from all qualified candidates, including aboriginal peoples, persons with disabilities, and members of visible minorities. Interested candidates should consult the eligibility criteria for the UFA on the NSERC website: http://www.nserc.ca/programs/sf/UFA_e. To apply, send a curriculum vitae, a short statement of research interests, a publication list with selected reprints, and three letters of reference to: Dr. 
Bruce Milliken, Department of Psychology, McMaster University, Hamilton, Ontario, CANADA L8S 4K1. Closing date for applications and supporting material is November 15, 1998. From at at coglit.soton.ac.uk Tue Oct 6 04:05:13 1998 From: at at coglit.soton.ac.uk (Adriaan Tijsseling) Date: Tue, 6 Oct 1998 09:05:13 +0100 Subject: PhD Thesis available: Connectionist Models of Categorization Message-ID: The following PhD thesis is available on our website (http://cogito.psy.soton.ac.uk/~at/CALM/): Connectionist Models of Categorization: A Dynamical View of Cognition by Adriaan Tijsseling Abstract The functional role of altered similarity structure in categorization is analyzed. 'Categorical Perception' (CP) occurs when equal-sized physical differences in the signals arriving at our sensory receptors are perceived as smaller within categories and larger between categories (Harnad, 1987). Our hypothesis is that it is by modifying the similarity between internal representations that successful categorization is achieved. This effect depends in part on the iconicity of the inputs, which induces a similarity-preserving structure in the internal representations. Categorizations based on the similarity between stimuli are easier to learn than contra-iconic categorization; it is mainly to modify the latter in the service of categorization that the characteristic compression/separation of CP occurs. This hypothesis was tested in a series of neural net simulations of studies on category learning in human subjects. The nets are first pre-exposed to the inputs and then given feedback on their performance. The behavior of the resulting networks was then analyzed and compared to human performance. Before it is given feedback, the network discriminates and categorizes input based on the inherent similarity of the input structure. With corrective feedback the net moves its internal representations away from category boundaries.
The effect is that similarity of patterns that belong to different categories is decreased, while similarity of patterns from the same category is increased (CP). Neural net simulations make it possible to look inside a hypothetical black box of how categorization may be accomplished; it is shown how increased attention to one or more dimensions in the input and the salience of input features affect category learning. Moreover, the observed 'warping' of similarity space in the service of categorization can provide useful functionality by creating compact, bounded chunks (Miller, 1965) with category names that can then be combined into higher-order categories described by the symbol strings of natural language and the language of thought (Greco, Cangelosi, & Harnad, 1997). The dynamic models of categorization of the kind analyzed here can be extended to make them powerful models of neuro-symbolic processing (Casey, 1997) and a fruitful territory for future research.

From terry at salk.edu Tue Oct 6 13:28:27 1998
From: terry at salk.edu (Terry Sejnowski)
Date: Tue, 6 Oct 1998 10:28:27 -0700 (PDT)
Subject: NEURAL COMPUTATION 10:8
Message-ID: <199810061728.KAA29775@helmholtz.salk.edu>

Neural Computation - Contents Volume 10, Number 8 - November 15, 1998

ARTICLE
  Competition for Neurotrophic Factors: Mathematical Analysis
    T. Elliott and N. R. Shadbolt

NOTE
  Why Does the Somatosensory Homunculus Have Hands Next to Face and Feet Next to Genitals?: An Hypothesis
    Martha J. Farah

LETTERS
  Extracting Oscillations: Neuronal Coincidence Detection with Noisy Periodic Spike Input
    Richard Kempter, Wulfram Gerstner, J. Leo van Hemmen and H. Wagner
  Connecting Cortical And Behavioral Dynamics: Bimanual Coordination
    V. K. Jirsa, A. Fuchs, J. A. S. Kelso
  Constructive Incremental Learning from Only Local Information
    Stefan Schaal and Christopher G. Atkeson
  Information Maximization and Independent Component Analysis: Is There a Difference?
    D. Obradovic and G. Deco
  An Alternative Perspective on Adaptive Independent Component Analysis Algorithms
    Mark Girolami
  Density Estimation by Mixture Models with Smoothing Priors
    Akio Utsugi
  Complexity Issues in Natural Gradient Descent Method for Training Multilayer Perceptrons
    Howard Hua Yang and Shun-ichi Amari
  Almost Linear VC-Dimension Bounds for Piecewise Polynomial Networks
    Peter L. Bartlett, Vitaly Maiorov, and Ron Meir
  The Diabolo Classifier
    Holger Schwenk
  Online Learning from Finite Training Sets and Robustness to Input Bias
    Peter Sollich and David Barber
  Anti-Predictable Sequences: Harder to Predict Than Random Sequences
    Huaiyu Zhu and Wolfgang Kinzel

-----

ABSTRACTS - http://mitpress.mit.edu/NECO/

SUBSCRIPTIONS - 1999 - VOLUME 11 - 8 ISSUES

                 USA    Canada*  Other Countries
Student/Retired  $50    $53.50   $84
Individual       $82    $87.74   $116
Institution      $302   $323.14  $336
* includes 7% GST

(Back issues from Volumes 1-10 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.)

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889  FAX: (617) 258-6779
mitpress-orders at mit.edu

-----

From a.burkitt at medoto.unimelb.edu.au Tue Oct 6 19:30:48 1998
From: a.burkitt at medoto.unimelb.edu.au (Anthony BURKITT)
Date: Wed, 07 Oct 1998 09:30:48 +1000
Subject: Preprints available
Message-ID: <60E1B9CE4896D111A22700E02910059714719E@mail.medoto.unimelb.EDU.AU>

The following two papers on the analysis of integrate and fire neurons are available at the site: http://www.medoto.unimelb.edu.au/people/burkitta/Pubs.html

1. Analysis of integrate and fire neurons: synchronization of synaptic input and spike output
2. New technique for analyzing integrate and fire neurons

I'd welcome comments. Cheers, Tony Burkitt

===============================================
Analysis of integrate and fire neurons: synchronization of synaptic input and spike output

A. N. Burkitt and G. M.
Clark A new technique for analyzing the probability distribution of output spikes for the integrate and fire model is presented. This technique enables us to investigate models with arbitrary synaptic response functions that incorporate both leakage across the membrane and a rise time of the postsynaptic potential. The results, which are compared with numerical simulations, are exact in the limit of a large number of small amplitude inputs. This method is applied to the synchronization problem, in which we examine the relationship between the spread in arrival times of the inputs (the temporal jitter of the synaptic input) and the resultant spread in the times at which the output spikes are generated (output jitter). The results of previous studies, which indicated that the ratio of the output jitter to the input jitter is consistently less than one and that it decreases for increasing numbers of inputs, are confirmed for three classes of the integrate and fire model. In addition to the previously identified factors of axonal propagation times and synaptic jitter, we identify the variation in the spike generating thresholds of the neurons and the variation in the number of active inputs as being important factors that determine the timing jitter in layered networks. Previously observed phase differences between optimally and suboptimally stimulated neurons may be understood in terms of the relative time taken to reach threshold. http://www.medoto.unimelb.edu.au/people/burkitta/synch.ps.zip Accepted for publication in Neural Computation (to appear) =============================================== New technique for analyzing integrate and fire neurons A. N. Burkitt and G. M. Clark By integrating over the distribution of arrival times of the afferent postsynaptic potentials (PSPs), the probability density of the summed potential is calculated. 
An output spike is generated when the potential reaches threshold, and the probability density of output spikes (the first-passage density) is calculated. This "integrated-input" technique enables the investigation of models that incorporate the decay of the membrane potential and the rise time of the synaptic current. PSPs with a distribution of amplitudes, including inhibitory PSPs, are also analyzed. The results are exact in the limit of large numbers of small amplitude inputs.

http://www.medoto.unimelb.edu.au/people/burkitta/cns98.ps.zip
Presented at CNS*98, to appear in Neurocomputing (in 1999).

====================ooOOOoo====================
Anthony N. Burkitt
The Bionic Ear Institute
384-388 Albert Street
East Melbourne, VIC 3002
Australia
Email: a.burkitt at medoto.unimelb.edu.au
http://www.medoto.unimelb.edu.au/people/burkitta
Phone: +61 - 3 - 9283 7510
Fax: +61 - 3 - 9283 7518
=====================ooOOOoo===================

From Annette_Burton at Brown.edu Tue Oct 6 12:22:56 1998
From: Annette_Burton at Brown.edu (Annette Burton)
Date: Tue, 6 Oct 1998 12:22:56 -0400
Subject: PROGRAM ANNOUNCEMENT
Message-ID:

Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science have just received a new NSF-supported Interdisciplinary Graduate Education, Research and Training (IGERT) program, with support for graduate students. Could you please post the following message on your list? Thank you.
Katherine Demuth
Department of Cognitive and Linguistic Sciences
Brown University
-------------

Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce

A NEW INTERDISCIPLINARY GRADUATE TRAINING PROGRAM in
LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES

Deadline for Applications: January 1, 1999

Brown University is actively recruiting graduate students for a new NSF-supported Interdisciplinary Graduate Education, Research and Training (IGERT) program in "Learning and Action in the Face of Uncertainty: Cognitive, Computational and Statistical Approaches". The use of probabilistic models and statistical methods has had a major impact on our understanding of language, vision, action, and reasoning. This training program provides students with the opportunity to integrate a detailed study of human or artificial systems for language acquisition and use, visual processing, action, and reasoning with appropriate mathematical and computational models. Students will be enrolled in one of the three participating departments (Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science) and will study an interdisciplinary program of courses in topics such as statistical estimation, cognitive processes, linguistics, and computational models. The aim of this program is to provide promising students with a mix of mathematical, computational and experimental expertise to carry out multidisciplinary collaborative research across the disciplines of Applied Mathematics, Computer Science, and Cognitive Science. Interested students should apply to the participating department closest to their area of interest and expertise, and should indicate their interest in the IGERT training program in their application. Brown University is an Equal Opportunity/Affirmative Action Employer.
For additional information about the program, application procedures, and ongoing research initiatives, please visit our website at http://www.cog.brown.edu/IGERT or contact:

Dr. Julie Sedivy
Department of Cognitive & Linguistic Sciences
Brown University, Box 1978
Providence, RI 02912 USA
Julie_Sedivy at brown.edu

*****************************
Katherine Demuth
Dept. of Cognitive & Linguistic Sciences
Brown University, Box 1978
Providence, RI 02912
TEL: (401) 863-1053
FAX: (401) 863-2255

From barry at dcs.rhbnc.ac.uk Wed Oct 7 06:50:53 1998
From: barry at dcs.rhbnc.ac.uk (Barry Rising)
Date: Wed, 7 Oct 1998 11:50:53 +0100 (BST)
Subject: ANNOUNCEMENT: ASPeCT Fraud Detection Workshop
Message-ID:

ASPeCT (Advanced Security for Personal Communications Technologies) is a Europe-wide research project which is part of the European Commission's ACTS programme. One of the topics addressed by ASPeCT has been the investigation of fraud detection techniques for third-generation mobile telecommunications using neural networks. On behalf of ASPeCT I would like to invite you to the ASPeCT Fraud Detection Workshop, to be held on Friday 16th October 1998 at Savill Court Conference Centre, Egham, Surrey, England. The aims of the workshop are to promote awareness of the problems of fraud detection in mobile telecom technology and of credit risk assessment using automated techniques. The speakers are from the ASPeCT fraud detection project and experts from industry who will relay their experiences in this rapidly changing research area. The cost of the workshop is 50 UK Pounds, which includes lunch, coffee breaks and proceedings. Places are strictly limited. For more details see http://www.dcs.rhbnc.ac.uk/~barry or contact me via email (barry at dcs.rhbnc.ac.uk). More information about the ASPeCT project can be found at http://www.esat.kuleuven.ac.be/cosic/aspect where you can also download some of the reports produced by the project.
Best regards,
Barry Rising
Research Assistant, ASPeCT

From higdon at stat.Duke.EDU Thu Oct 8 12:33:59 1998
From: higdon at stat.Duke.EDU (David Higdon)
Date: Thu, 8 Oct 1998 12:33:59 -0400 (EDT)
Subject: Postdoc Opportunity
Message-ID:

*** Postdoctoral Research Associate ***
Institute of Statistics & Decision Sciences
Duke University

Applications are invited for a position as a post-doctoral research associate in the Institute of Statistics and Decision Sciences, Duke University. This is an initial three-year appointment, with a target starting date of May 1st, 1999. The position arises as part of a new multi-disciplinary research project on multi-scale modeling and simulation in scientific inference, funded by NSF. This project, conducted through the Center for Multi-Scale Modeling and Distributed Computing at Duke, combines research teams from statistics, hydrology, mathematics and computer science, and has key research foci in challenging problems of spatial statistical modeling and advanced statistical computation. Applied project components concern important problems in environmental and petroleum hydrology. The project begins in May 1999 with an initial three-year term.

The Postdoctoral Research Associate will focus on research related to the development of novel multi-resolution spatial models, advanced statistical computation, and methodology for integrating data sources in an applied setting. The associate will work closely with ISDS researchers and with investigators and postdoctoral associates in mathematics, computing and hydrology. Suitable applicants will have a PhD in Statistics or a related field. We particularly seek candidates with experience in spatial modeling and advanced scientific computing, and with inclinations towards cross-disciplinary research. Applications from suitably qualified women and minority candidates are particularly encouraged. Duke University is an Equal Opportunity/Affirmative Action Employer.
For additional information, visit our web page at www.stat.duke.edu. To apply, please send a CV and 3 letters of reference to: Search Committee, Box 90251, Duke University, Durham, NC 27708-0251. Informal inquiries may be directed to Dave Higdon at higdon at stat.duke.edu or Mike West at mw at stat.duke.edu.

From mel at lnc.usc.edu Thu Oct 8 15:45:57 1998
From: mel at lnc.usc.edu (Bartlett Mel)
Date: Thu, 08 Oct 1998 12:45:57 -0700
Subject: NIPS Travel Grants
Message-ID: <361D1675.D68DF3E9@lnc.usc.edu>

Neural Information Processing Systems Conference NIPS*98

------- TRAVEL GRANTS -------
****** Deadline Oct. 19 ******

Limited funds will be available to support the travel of young investigators, post-doctoral fellows, and graduate students to NIPS. Awards will be based on both merit and need. The amount of aid will generally not exceed $400 for domestic (US) travel; increased amounts may be available for participants travelling from overseas. Conference registration is not covered by travel awards. In order to speed processing and reduce applicant workload, applications for travel grants MUST be submitted this year via the NIPS Conference web page form for this purpose: http://www.cs.cmu.edu/Groups/NIPS/NIPS98/Travel.html With the creation of this automated application system, there is no need to submit hardcopies of your application, as was indicated in the (now superseded) printed brochure. Even if you have already submitted an application to the NIPS Foundation office, you must still submit the web page form. Deadline for submission is Oct. 19. Notification of award will be emailed in early November. Travel award checks in US$ may be picked up upon registration at the conference. A photocopy of your airline ticket will be required. Applications that arrive after the deadline will be collected and considered as a separate pool in case any awarded travel funds are left uncollected at the conference.

Bartlett Mel
NIPS*98 Treasurer

--
Bartlett W.
Mel (213)740-0334, -3397(lab)
Assistant Professor of Biomedical Engineering (213)740-0343 fax
University of Southern California, OHE 500
mel at lnc.usc.edu, http://lnc.usc.edu
US Mail: BME Department, MC 1451, USC, Los Angeles, CA 90089
Fedex: 3650 McClintock Ave, 500 Olin Hall, LA, CA 90089

From audeb at dai.ed.ac.uk Fri Oct 9 13:34:59 1998
From: audeb at dai.ed.ac.uk (Aude Billard)
Date: Fri, 9 Oct 1998 18:34:59 +0100
Subject: Preprints available
Message-ID: <19324.199810091734@osprey>

The following paper "DRAMA, a connectionist architecture for control and learning in autonomous robots" is to appear in Adaptive Behaviour Journal, vol. 7:1 (January 1999). A preprint of the paper is available at the site: http://www.dai.ed.ac.uk/daidb/people/homes/audeb/publication.html

This paper reports on the development of a novel connectionist architecture used for on-line learning of spatio-temporal regularities and time series in discrete sequences of inputs of an autonomous mobile robot. An on-line version of my PhD thesis, of which the paper reports some aspects, will soon be available. I am very grateful for any comments. Thank you for transmitting the message.

Aude Billard

=================================================================
DRAMA, a connectionist architecture for control and learning in autonomous robots, Billard A. and Hayes G. (1998), In Adaptive Behaviour Journal, vol. 7:1.

This work proposes a connectionist architecture, DRAMA, for dynamic control and learning of autonomous robots. DRAMA stands for dynamical recurrent associative memory architecture. It is a time-delay recurrent neural network using Hebbian update rules. It allows learning of spatio-temporal regularities and time series in discrete sequences of inputs, in the face of a significant amount of noise. The first part of this paper gives the mathematical description of the architecture and analyses its performance theoretically and through numerical simulations.
The second part of this paper reports on the implementation of DRAMA in simulated and physical robotic experiments. Training and rehearsal of the DRAMA architecture are computationally fast and inexpensive, which makes the model particularly suitable for controlling `computationally-challenged' robots. In the experiments, we use a basic hardware system with very limited computational capability and show that our robot can carry out real-time computation and on-line learning of relatively complex cognitive tasks. In these experiments, two autonomous robots wander randomly in a fixed environment, collecting information about its elements. By mutually associating information from their sensors and actuators, they learn about physical regularities underlying their experience of varying stimuli. The agents also learn from their mutual interactions. We use a teacher-learner scenario, based on mutual following of the two agents, to enable transmission of a vocabulary from one robot to the other.

Keywords: Time-delay recurrent neural network; Hebbian learning; spatio-temporal associations; unsupervised dynamical learning; autonomous robots.

From emmanuel.pothos at psy.ox.ac.uk Mon Oct 12 14:24:42 1998
From: emmanuel.pothos at psy.ox.ac.uk (Emmanuel Pothos)
Date: Mon, 12 Oct 1998 19:24:42 +0100 (BST)
Subject: post doc position
Message-ID:

UNIVERSITY OF WALES, BANGOR
RESEARCH ASSISTANT JOB ADVERT
SALARY GRADE 1B - 15,735 - 18,275

A Research Officer is sought to work with Dr. Emmanuel Pothos and Professor Nick Chater on an exciting project relating to human categorisation. In the first instance, the project would involve computer implementation and analysis of an information theory model of classification, developed by Pothos and Chater. Subsequently, human experimental work would be designed to investigate the psychological plausibility of the model.
Although the emphasis will be on the psychological implications of the theory, statistical applications will also be pursued, with a view to addressing clustering and data mining problems. Candidates should have a high level of computer programming competence (preferably C or Matlab); expertise in the statistical clustering literature and human experimental skills are desirable but not necessary. Thus, we welcome applications from candidates with a Ph.D. or a good MSc in computer science, statistics, cognitive psychology and related disciplines. The post will be for one year commencing January 1999 (although this is flexible), held at the University of Wales, Bangor, one of the fastest growing departments in the UK, which received a 5A rating in the most recent RAE.

Application forms and further particulars are available by contacting: Personnel Services, University of Wales, Bangor, Gwynedd, LL57 2DG. tel.: 01248-382926/388132. e-mail: pos020 at bangor.ac.uk. Please quote reference number 98/201 when applying. Closing date for applications: Monday 16th November 1998. Informal enquiries can be made by contacting Dr Emmanuel Pothos (e-mail: e.pothos at bangor.ac.uk) or Professor Nick Chater (e-mail: nick.chater at warwick.ac.uk).

Committed to Equal Opportunities

From harnad at coglit.soton.ac.uk Tue Oct 13 04:25:50 1998
From: harnad at coglit.soton.ac.uk (Stevan Harnad)
Date: Tue, 13 Oct 1998 09:25:50 +0100 (BST)
Subject: BBS Call (Neuron Doctrine) + 3 important announcements
Message-ID:

3 important announcements, followed by BBS Call for Commentators (Gold/Stoljar: Neuron Doctrine):

------------------------------------------------------------------
(1) There have been some very important developments in the area of Web archiving of scientific papers in this last month.
Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers on their Home-Servers as well as on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ --------------------------------------------------------------------- Below is the abstract of a forthcoming BBS target article on: A NEURON DOCTRINE IN THE PHILOSOPHY OF NEUROSCIENCE by Ian Gold & Daniel Stoljar This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. 
To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. 
_____________________________________________________________ A NEURON DOCTRINE IN THE PHILOSOPHY OF NEUROSCIENCE Ian Gold Institute of Advanced Studies, Australian National University, Canberra ACT 0200, Australia iangold at coombs.anu.edu.au Daniel Stoljar Department of Philosophy and Institute of Cognitive Science, University of Colorado, Boulder 80309 stoljar at colorado.edu and Institute of Advanced Studies, Australian National University Canberra ACT 0200, Australia dstoljar at coombs.anu.edu.au KEYWORDS: Churchlands, classical conditioning, cognitive neuroscience, Kandel, learning, materialism, mind, naturalism, neurobiology, neurophilosophy, philosophy of neuroscience, psychology, reduction, theoretical unification ABSTRACT: Many neuroscientists and philosophers endorse a view about the explanatory reach of neuroscience which we will call the neuron doctrine to the effect that the framework for understanding the mind will be developed by neuroscience; or, as we will put it, that a successful theory of the mind will be solely neuroscientific. It is a consequence of this view that the sciences of the mind that cannot be expressed by means of neuroscientific concepts alone count as indirect sciences that will be discarded as neuroscience matures. This consequence is what makes the doctrine substantive, indeed, radical. We ask, first, what the neuron doctrine means and, second, whether it is true. In answer to the first question, we distinguish two versions of the doctrine. One version, the trivial neuron doctrine, turns out to be uncontroversial but unsubstantive because it fails to have the consequence that the non-neuroscientific sciences of the mind will eventually be discarded. A second version, the radical neuron doctrine, does have this consequence, but, unlike the first doctrine, is highly controversial. We argue that the neuron doctrine appears to be both substantive and uncontroversial only as a result of a conflation of these two versions. 
We then consider whether the radical doctrine is true. We present and evaluate three arguments for it, based either on general scientific and philosophical considerations or on the details of neuroscience itself, and argue that all three fail. We conclude that the evidence fails to support the radical neuron doctrine.
____________________________________________________________

To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article.

The URLs you can use to get to the BBS Archive:
http://www.princeton.edu/~harnad/bbs/
http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.gold.html
ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.gold
ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.gold

To retrieve a file by ftp from an Internet site, type either:
  ftp ftp.princeton.edu
or
  ftp 128.112.128.1
When you are asked for your login, type:
  anonymous
Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@"), then:
  cd /pub/harnad/BBS
To show the available files, type:
  ls
Next, retrieve the file you want with (for example):
  get bbs.gold
When you have the file(s) you want, type:
  quit

From bert at mbfys.kun.nl Tue Oct 13 08:29:51 1998
From: bert at mbfys.kun.nl (Bert Kappen)
Date: Tue, 13 Oct 1998 14:29:51 +0200 (MET DST)
Subject: PhD position available
Message-ID: <199810131229.OAA10471@bertus.mbfys.kun.nl>

PhD position for neural network research at SNN, University of Nijmegen, the Netherlands.
Background: The SNN neural networks research group at the University of Nijmegen consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and graphical models. The group is part of the Laboratory of Biophysics, which is involved in experimental brain science. Recent research of the group has focussed on: theoretical description of learning processes using the theory of stochastic processes and the design of efficient learning rules for Boltzmann machines using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; the design of robust methods for confidence estimation with neural networks; and applications in medical diagnosis and prediction of consumer behaviour.

Research project: The modern view in AI, neural networks, and parts of statistics is to describe learning and reasoning using a probabilistic framework. A particular advantage of the probabilistic framework is that domain knowledge in the form of rules and data can be easily combined in model construction. The main drawback is that inference and learning in large probabilistic networks is intractable. Therefore, robust approximation schemes are needed to apply this technology to large real-world applications. The topic of research is to develop learning rules for neural networks and graphical models using techniques from statistical mechanics.

Requirements: The candidate should have a strong background in theoretical physics or mathematics.

The PhD position: Appointment will be full-time for four years. Gross salary will be NLG 2184 per month in the first year, increasing to NLG 3899 in the fourth year.

More information: Details about the research can be found at http://www.mbfys.kun.nl/SNN or by contacting dr. H.J. Kappen (bert at mbfys.kun.nl, ++31243614241).
Applications should include a curriculum vitae, a statement of the candidate's professional interests and goals, and one copy of recent work (e.g., thesis, article). Interested applicants should express their interest by email to bert at mbfys.kun.nl before October 17. Full applications should be sent before October 25 to the Personnel Department of the Faculty of Natural Sciences, University of Nijmegen, Toernooiveld 1, 6525 ED Nijmegen, vacancy number 98-52.

From alimoglu at cns.bu.edu Tue Oct 13 16:20:39 1998
From: alimoglu at cns.bu.edu (Fevzi Alimoglu)
Date: Tue, 13 Oct 1998 16:20:39 -0400 (EDT)
Subject: Call For Papers: TAINN'99
Message-ID:

CALL FOR PAPERS

TAINN'99
The Eighth Turkish Symposium on Artificial Intelligence and Neural Networks
June 23-25, 1999
Bogazici University, Istanbul, TURKEY

ORGANIZED BY: Bogazici University

BACKGROUND: TAINN'99 is the eighth in a series of symposia intended to bring together researchers from the fields of artificial intelligence and neural networks. The symposium will be held in the historical city of Istanbul, which has been the capital of two empires and is the economic and cultural center of modern Turkey.
SCOPE: AI Architectures, Artificial Life, Automated Modeling, Automated Reasoning, Bayesian Learning and Belief Networks, Control, Causality, Cognitive Modeling, Common Sense Reasoning, Computer-Aided Education, Decision Trees, Design, Diagnosis, Discourse, Discovery, Distributed AI, Expert Systems, Fuzzy Logic, Game Playing, Genetic Algorithms, Geometric or Spatial Reasoning, Hardware Realizations of Neural Networks and Fuzzy Systems, Information Retrieval, Knowledge Acquisition, Knowledge Representation, Logic Programming, Machine Discovery, Machine Learning, Machine Translation, Mathematical Foundations, Multi-agent Systems, Multimedia, Natural Language Processing, Neural Networks, Non-monotonic Reasoning, Pattern Recognition, Perception, Philosophical Foundations, Planning, Problem Solving, Qualitative Reasoning, Real-Time Systems, Reinforcement Learning, Robotics, Scheduling, Search, Simulation, Software Agents, Speech Understanding, Symbolic Computation, Temporal Reasoning, Vision. PAPER SUBMISSION: Original papers from all areas listed above are solicited. The symposium languages are Turkish and English. All submitted papers will be refereed on the basis of quality, significance, and clarity by program committee members and other reviewers. All accepted papers will be published in the symposium proceedings which will be available at the meeting. The symposium will consist of technical presentations, invited talk(s) and poster sessions. Prospective authors are invited to submit full papers exclusively in electronic media in the format of postscript files attached to e-mail messages. Papers are limited to 10 pages, including figures and references, in single-spaced one-column format using a font size of 11 points. See the symposium webpage for style information. PROGRAM COMMITTEE: L. Akarun Turkey H. L. Akin Turkey V. Akman Turkey F. Alpaslan Turkey E. Alpaydin Turkey V. Atalay Turkey I. Aybay N. Cyprus C. Bozsahin Turkey I. Bratko Slovenia I. 
Cicekli Turkey T. Ciftcibasi N. Cyprus D. Davenport Turkey C. Dichev Bulgaria G. W. Ernst USA A. Fatholahzadeh France M. Guler Turkey F. Gurgen Turkey H. A. Guvenir Turkey C. Guzelis Turkey U. Halici Turkey H. Hamburger USA S. Kocabas Turkey F. Masulli Italy K. Oflazer Turkey E. Oztemel Turkey Y. Ozturk USA J.-I. Tsujii UK, Japan G. Ucoluk Turkey N. Yalabik Turkey F. T. Yarman-Vural Turkey A. Yazici Turkey PROGRAM CO-CHAIRS: A. C. Cem Say say at boun.edu.tr Gunhan Dundar dundar at boun.edu.tr DEADLINES: Submission: February 1, 1999 Notification: April 5, 1999 Final Copy Due: May 3, 1999 For more information, frequently access the URL: http://www.cmpe.boun.edu.tr/~tainn99/ CONTACTS: TAINN'99 Department of Computer Engineering Boğaziçi University, Bebek 80815, İstanbul, TURKEY e-mail: say at boun.edu.tr phone: 90-212-2631540 Ext. 1628 (leave message) fax: 90-212-2872461 From skumar at cs.utexas.edu Wed Oct 14 11:41:53 1998 From: skumar at cs.utexas.edu (Shailesh Kumar) Date: Wed, 14 Oct 1998 10:41:53 -0500 (CDT) Subject: thesis on Q-learning for packet routing Message-ID: <199810141541.KAA22300@plucky.cs.utexas.edu> Dear Connectionists, My thesis on packet routing with Q-learning is available at the UTCS Neural Networks Research Group web site http://www.cs.utexas.edu/users/nn. - Shailesh CONFIDENCE BASED DUAL REINFORCEMENT Q-ROUTING: AN ON-LINE ADAPTIVE NETWORK ROUTING ALGORITHM Shailesh Kumar Master's Thesis; Technical Report AI98-267, Department of Computer Sciences, The University of Texas at Austin (108 pages) FTP-host: ftp.cs.utexas.edu FTP-filename: pub/neural-nets/papers/kumar.msthesis.ps.Z http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#kumar.msthesis.ps.Z Efficient routing of information packets in dynamically changing communication networks requires that as the load levels, traffic patterns and topology of the network change, the routing policy also adapts. 
Making globally optimal routing decisions would require a central observer/controller with complete information about the state of all nodes and links in the network, which is not realistic. Therefore, the routing decisions must be made locally by individual nodes (routers) using only local routing information. The routing information at a node could be estimates of packet delivery time to other nodes via its neighbors, or estimates of queue lengths of other nodes in the network. An adaptive routing algorithm should efficiently explore and update the routing information available at different nodes as it routes packets. It should continuously evolve efficient routing policies with minimum overhead on network resources. In this thesis, an on-line adaptive network routing algorithm called Confidence-based Dual Reinforcement Q-Routing (CDRQ-Routing), based on the Q-learning framework, is proposed and evaluated. In this framework, the routing information at individual nodes is maintained as Q-value estimates of how long it will take to send a packet to any particular destination via each of the node's neighbors. These Q values are updated through exploration as the packets are transmitted. The main contribution of this work is the faster adaptation and the improved quality of routing policies over standard Q-Routing. The improvement is based on two ideas. First, the quality of exploration is improved by including with each Q value a confidence measure representing how reliable that Q value is. The learning rate is a function of these confidence values. Second, the quantity of exploration is increased by including backward exploration into Q-learning. As a packet hops from one node to another, it not only updates a Q value in the sending node (forward exploration, as in Q-Routing), but also updates a Q value in the receiving node using the information appended to the packet when it is sent out (backward exploration). 
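The forward and backward updates described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the thesis code: the table layout, the function names, and in particular the confidence-based learning-rate rule are hypothetical choices made for the example.

```python
# Sketch of the two Q-value updates per hop (illustrative, not the thesis code).
# Q[(n, d, v)] estimates delivery time from node n to destination d via
# neighbor v; C[(n, d, v)] is a confidence in (0, 1] for that estimate.

def best_estimate(Q, node, dest, neighbors):
    """Best (smallest) estimated delivery time from node to dest."""
    return min(Q[(node, dest, v)] for v in neighbors[node])

def cdrq_update(Q, C, key, target, c_in):
    """Move Q[key] toward target; the learning rate grows with the
    confidence of the incoming estimate relative to our own (a
    hypothetical rate rule, standing in for the thesis's)."""
    eta = c_in / (c_in + C[key])
    Q[key] += eta * (target - Q[key])
    C[key] += eta * (c_in - C[key])

def on_hop(Q, C, neighbors, x, y, src, dest, wait_x, wait_y, c_fwd, c_bwd):
    """A packet traveling from src to dest hops x -> y: two updates per hop."""
    # Forward exploration (as in plain Q-Routing): x refines its estimate
    # for dest via y, using y's best estimate plus x's own waiting time.
    cdrq_update(Q, C, (x, dest, y),
                wait_x + best_estimate(Q, y, dest, neighbors), c_fwd)
    # Backward exploration: the packet carries the sender's routing
    # information, so y also refines its estimate for src via x.
    cdrq_update(Q, C, (y, src, x),
                wait_y + best_estimate(Q, x, src, neighbors), c_bwd)
```

On a toy three-node line network, a single hop updates one Q value at the sender and one at the receiver, matching the "two Q value updates per packet hop" the abstract describes.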
Thus two Q value updates per packet hop occur in CDRQ-Routing, as against only one in Q-Routing. Certain properties of forward and backward exploration that form the basis of these update rules are stated and proved in this work. Experiments over several network topologies, including a 36-node irregular grid and a 128-node 7-D hypercube, indicate that the improvement in quality and increase in quantity of exploration contribute in complementary ways to the performance of the overall routing algorithm. CDRQ-Routing was able to learn optimal shortest-path routing at low loads and efficient routing policies at medium loads almost twice as fast as Q-Routing. At high load levels, the routing policy learned by CDRQ-Routing was twice as good as that learned by Q-Routing in terms of average packet delivery time. CDRQ-Routing was found to adapt significantly faster than Q-Routing to changes in network traffic patterns and network topology. The final routing policies learned by CDRQ-Routing were able to sustain much higher load levels than those learned by Q-Routing. Analysis shows that the exploration overhead incurred in CDRQ-Routing is less than 0.5% of the packet traffic. Various extensions of CDRQ-Routing, namely routing in heterogeneous networks (different link delays and router processing speeds), routing with adaptive congestion control (in the case of finite queue buffers), and the inclusion of predictive features, have been proposed as future work. CDRQ-Routing is both superior to and more realistic than state-of-the-art distance vector routing and the Q-Routing algorithm. From moatl at cs.tu-berlin.de Thu Oct 15 12:38:39 1998 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Thu, 15 Oct 1998 18:38:39 +0200 Subject: PhD-position available Message-ID: <3626250F.69F3@cs.tu-berlin.de> COMPUTATIONAL NEUROSCIENCE PhD-POSITION AVAILABLE Dept. 
of Computer Science Technical University of Berlin A PhD-position in the field of computational neuroscience is now available at the neural information processing group (Prof. Klaus Obermayer), Dept. of Computer Science of the Technical University of Berlin. The project aims at the development of new neural and statistical methods for the analysis of data sets generated by optical imaging of intrinsic signals. This method utilizes neural-activity-related changes in the light reflectance of neural tissue, which are mediated by variations in local blood-oxygenation, blood-flow, blood volume and light scattering. The raw data sets contain a superposition of several signal components, only some of which are related to neural activity with a sufficiently high spatial resolution. The methods to be developed include: (i) Theoretical studies of the image generation process by Monte-Carlo simulations of light propagation in neuronal tissue. (ii) Development and application of methods for adaptive denoising of the data sets and the extraction of stimulus-related signals from these data sets. Requirements: The candidate should have a degree in physics, mathematics, engineering or computer science and a strong background in theory. Appointment: The appointment will initially be for two years but can be extended by one more year. The salary is determined according to BAT IIa/2, corresponding to a net salary (after deduction of taxes) of approx. DM 1700.-- per month. More information: Details about the research can be found at http://www.ni.cs.tu-berlin.de or by contacting Prof. Klaus Obermayer (oby at cs.tu-berlin.de, ++49-30-314-73120). Applications should include CV, list of publications, a statement of the candidate's professional interests and goals, and one copy of recent work (e.g., thesis, article). Applicants should express their interest by email to oby at cs.tu-berlin.de before November 15. Full applications should be sent before November 30, 1998 to Prof. Dr. 
Klaus Obermayer, FR2-1, FB 13, Informatik, Technische Universitaet Berlin Franklinstrasse 28/29 D-10587 Berlin Germany From raetsch at zoo.brain.riken.go.jp Fri Oct 16 07:32:06 1998 From: raetsch at zoo.brain.riken.go.jp (raetsch@zoo.brain.riken.go.jp) Date: Fri, 16 Oct 1998 20:32:06 +0900 (JST) Subject: TR on Soft Margins for AdaBoost Message-ID: Dear Connectionists, A new paper on a Soft Margin approach for AdaBoost is available: ``Soft Margins for AdaBoost'', G. R\"atsch, T. Onoda, K.-R. M\"uller, NeuroColt2 TechReport NC-TR-1998-021. http://www.first.gmd.de/~raetsch/Neurocolt_Margin.ps.gz Comments and questions are welcome. Please contact me at raetsch at first.gmd.de. Thanks, Gunnar R\"atsch Abstract: Recently, ensemble methods like AdaBoost were successfully applied to character recognition tasks, seemingly defying the problem of overfitting. This paper shows that although AdaBoost rarely overfits in the low-noise regime, it clearly does so for higher noise levels. Central to understanding this fact is the margin distribution: by performing gradient descent in an error function with respect to the margin, AdaBoost asymptotically achieves a {\em hard margin} distribution, i.e. the algorithm concentrates its resources on a few hard-to-learn patterns (here an interesting overlap with Support Vectors emerges). This is clearly a sub-optimal strategy in the noisy case. We propose several regularization methods and generalizations of the original AdaBoost algorithm to achieve a Soft Margin, a concept known from Support Vector learning. In particular we suggest (1) regularized AdaBoost$_{Reg}$, using the soft margin directly in a modified loss function, and (2) regularized linear and quadratic programming (LP/QP-) AdaBoost, where the soft margin is attained by introducing slack variables. Extensive simulations demonstrate that the proposed regularized AdaBoost algorithms are useful and competitive for noisy data. 
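The mechanism behind the hard-margin behaviour, and the flavour of a regularized fix, can be seen in a toy AdaBoost sketch. This is hypothetical illustration code, not the report's algorithm: the exponential reweighting is standard discrete AdaBoost, while the `reg` damping term only gestures at the soft-margin idea (it is not the AdaBoost$_{Reg}$ or LP/QP formulation of the paper).

```python
import math

def adaboost(X, y, weak_learners, rounds=5, reg=0.0):
    """Toy discrete AdaBoost on 1-D data with +/-1 labels.
    reg > 0 damps extreme example weights (a soft-margin-flavoured
    heuristic for illustration, not the paper's exact update)."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []  # list of (alpha, weak_learner)
    for _ in range(rounds):
        # pick the weak learner with the smallest weighted error
        errs = [sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
                for h in weak_learners]
        err, h = min(zip(errs, weak_learners), key=lambda p: p[0])
        err = min(max(err, 1e-10), 1.0 - 1e-10)
        alpha = 0.5 * math.log((1.0 - err) / err)
        ensemble.append((alpha, h))
        # standard update: misclassified points gain weight exponentially;
        # this is what concentrates resources on a few hard patterns
        w = [wi * math.exp(-alpha * yi * h(xi))
             for wi, xi, yi in zip(w, X, y)]
        if reg > 0.0:
            # damping: shrink weight ratios so no single noisy point dominates
            w = [wi ** (1.0 / (1.0 + reg)) for wi in w]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the weighted vote of the ensemble."""
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0.0 else -1
```

On clean, separable data the plain update already works; the difference the abstract describes only shows up with noisy labels, where the unregularized weights blow up on the mislabeled points.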
From mkearns at research.att.com Fri Oct 16 16:14:11 1998 From: mkearns at research.att.com (Michael J. Kearns) Date: Fri, 16 Oct 1998 16:14:11 -0400 (EDT) Subject: NIPS*98 Program Message-ID: <199810162014.QAA24117@radish.research.att.com> Appended below is the full program for the main conference of NIPS*98, from Program Chair Sara Solla. It will also be posted shortly to the NIPS web site at http://www.cs.cmu.edu/Groups/NIPS/ At the web site you can also find detailed information about the tutorials preceding the main conference, and the workshops in Breckenridge immediately following the main conference. Please be aware that on-line registration is available at the web site, and that the deadline for early registration is October 30. Michael Kearns NIPS*98 General Chair ********************************************************************** Program for NIPS 1998 Neural Information Processing Systems SUN NOV 29 ---------- 18:00-22:00 Registration MON NOV 30 ---------- 08:30-18:00 Registration 09:30-17:30 Tutorials 18:30 Reception and Conference Banquet 20:30 The laws of the WEB (Banquet talk) B. Huberman Xerox PARC TUE DEC 1 --------- Oral Session 1: 08:30 Statistics of visual images: neural representation and synthesis (Invited) E. Simoncelli New York University 09:20 Attentional modulation of human pattern discrimination psychophysics reproduced by a quantitative model (VS, Oral) L. Itti, J. Braun, D. Lee, C. Koch California Institute of Technology 09:40 Orientation, scale, and discontinuity as emergent properties of illusory contour shape (VS, Oral) K. Thornber, L. Williams NEC Research Institute, University of New Mexico 10:00 DTs: dynamic trees (AA, Spotlight) C. Williams, N. Adams Aston University Modeling stationary and integrated time series with autoregressive neural networks (LT, Spotlight) F. Leisch, A. Trapletti, K. 
Hornik Technical University of Vienna Analog neural nets with Gaussian or other common noise distributions cannot recognize arbitrary regular languages (LT, Spotlight) W. Maass, E. Sontag Technical University of Graz, Rutgers University Semiparametric support vector and linear programming machines (AA, Spotlight) A. Smola, T. Friess, B. Schoelkopf GMD FIRST Blind separation of filtered source using state-space approach (AA, Spotlight) L. Zhang, A. Cichocki RIKEN Brain Science Institute 10:15-11:00 Break Oral Session 2: 11:00 The bias-variance tradeoff and the randomized GACV (AA, Oral) G. Wahba, X. Lin, F. Gao, D. Xiang, R. Klein, B. Klein University of Wisconsin-Madison, SAS Institute 11:20 Kernel PCA and de-noising in feature spaces (AA, Oral) S. Mika, B. Schoelkopf, A. Smola, K. Mueller, M Scholz, G. Raetsch GMD FIRST 11:40 Sparse code shrinkage: denoising by maximum likelihood estimation (AA162, Oral) A. Hyvaarinen, P. Hoyer, E. Oja Helsinki University of Technology 12:00-14:00 Lunch Oral Session 3: 14:00 Temporally asymmetric Hebbian learning, spike timing and neuronal response variability (Invited) L. Abbott Brandeis University 14:50 Information maximization in single neurons (NS, Oral) M. Stemmler, C. Koch California Institute of Technology 15:10 Multi-electrode spike sorting by clustering transfer functions (NS, Oral) D. Rinberg, H. Davidowitz, N. Tishby NEC Research Institute 15:30 Distributional population codes and multiple motion models (NS, Spotlight) R. Zemel, P. Dayan University of Arizona, Massachusetts Institute of Technology Population coding with correlated noise (NS, Spotlight) H. Yoon, H. Sompolinsky Hebrew University Bayesian modeling of human concept learning (CS, Spotlight) J. Tenenbaum Massachusetts Institute of Technology Mechanisms of generalization in perceptual learning (CS, Spotlight) Z. Liu, D. Weinshall NEC Research Institute, Hebrew University An entropic estimator for structure discovery (SP, Spotlight) M. 
Brand Mitsubishi Electric Research Laboratory 15:45-16:15 Break Oral Session 4: 16:15 The role of lateral cortical competition in ocular dominance development (NS, Oral) C. Piepenbrock, K. Obermayer Technical University of Berlin 16:35 Evidence for learning of a forward dynamic model in human adaptive control (CS, Oral) N. Bhushan, R. Shadmehr Johns Hopkins University 16:55-18:00 Poster Preview 19:30 Poster Session WED DEC 2 --------- Oral Session 5: 08:30 Computation by Cortical Modules (Invited) H. Sompolinsky Hebrew University 09:20 Learning curves for Gaussian processes (LT, Oral) P. Sollich University of Edinburgh 9:40 Mean field methods for classification with Gaussian processes (LT, Oral) M. Opper O. Winther Aston University, Niels Bohr Institute 10:00 Dynamics of supervised learning with restricted training sets (LT, Spotlight) A. Coolen, D. Saad King's College London, Aston University Finite-dimensional approximation of Gaussian processes (LT, Spotlight) G. Trecate, C. Williams, M. Opper University of Pavia, Aston University Inference in multilayer networks via large deviation bounds (LT, Spotlight) M. Kearns, L. Saul AT&T Labs Gradient descent for general reinforcement learning (CN, Spotlight) L. Baird, A. Moore Carnegie Mellon University Risk sensitive reinforcement learning (CN, Spotlight) R. Neuneier, O. Mihatsch Siemens AG 10:15-11:00 Break Oral Session 6: 11:00 VLSI implementation of motion centroid localization for autonomous navigation (IM, Oral) R. Etienne-Cummings, M. Ghani, V. Gruev Southern Illinois University 11:20 Improved switching among temporally abstract actions (CN, Oral) R. Sutton, S. Singh, D. Precup, B. Ravindran University of Massachusetts, University of Colorado 11:40 Finite-sample convergence rates for Q-learning and indirect algorithms (CN, Oral) M. Kearns, S. 
Singh AT&T Labs, University of Colorado 12:00-14:00 Lunch Oral Session 7: 14:00 Statistical natural language processing: better living through floating-point numbers (Invited) E. Charniak Brown University 14:50 Markov processes on curves for automatic speech recognition (SP, Oral) L. Saul, M. Rahim AT&T Labs 15:10 Approximate learning of dynamic models (AA, Oral) X. Boyen, D. Koller Stanford University 15:30 Learning nonlinear stochastic dynamics using the generalized EM algorithm (AA, Spotlight) Z. Ghahramani, S. Roweis University of Toronto, California Institute of Technology Reinforcement learning for trading systems (AP, Spotlight) J. Moody, M. Saffell Oregon Graduate Institute Bayesian modeling of facial similarity (AP, Spotlight) B. Moghaddam, T. Jebara, A. Pentland Mitsubishi Electric Research Laboratory, Massachusetts Institute of Technology Computation of smooth optical flow in a feedback connected analog network (IM, Spotlight) A. Stocker, R. Douglas University and ETH Zurich Classification on pairwise proximity data (AA, Spotlight) T. Graepel, R. Herbrich, P. Bollmann-Sdorra, K. Obermayer Technical University of Berlin 15:45-16:15 Break Oral Session 8: 16:15 Learning from dyadic data (AA, Oral) T. Hofmann, J. Puzicha, M. Jordan Massachusetts Institute of Technology, University of Bonn 16:35 Classification in non-metric spaces (VS, Oral) D. Weinshall, D. Jacobs, Y. Gdalyahu NEC Research Institute, Hebrew University 16:55-18:00 Poster Preview 19:30 Poster Session THU DEC 3 --------- Oral Session 9: 08:30 Convergence of the wake-sleep algorithm (LT, Oral) S. Ikeda, S. Amari, H. Nakahara RIKEN Brain Science Institute 08:50 Learning a continuous hidden variable model for binary data (AA, Oral) D. Lee, H. Sompolinsky Bell Laboratories, Hebrew University 09:10 Direct optimization of margins improves generalization in combined classifiers (LT, Oral) L. Mason, P. Bartlett, J. 
Baxter Australian National University 09:30 A polygonal line algorithm for constructing principal curves (AA, Oral) B. Kegl, A. Krzyzak, T. Linder, K. Zeger Concordia University, Queen's University, UC San Diego 09:50-10:30 Break Oral Session 10: 10:30 Fast neural network emulation of physics-based models for computer animation (AP, Oral) R. Grzeszczuk, D. Terzopoulos, G. Hinton Intel Corporation, University of Toronto 10:50 Graphical models for recognizing human interactions (AP, Oral) N. Oliver, B. Rosario, A. Pentland Massachusetts Institute of Technology 11:10 Things that think (Invited) N. Gershenfeld Massachusetts Institute of Technology 12:00 End of main conference POSTERS ------- A theory of mean field approximation (LT, Poster) T. Tanaka Tokyo Metropolitan University Replicator equations, maximal cliques, and graph isomorphism (AA, Poster) M. Pelillo University of Venice Almost linear VC dimension bounds for piecewise polynomial networks (LT, Poster) P. Bartlett, V. Maiorov, R. Meir Australian National University, Technion On the optimality of incremental neural network algorithms (LT, Poster) R. Meir, V. Maiorov Technion Controlling the complexity of HMM systems by regularization (SP, Poster) C. Neukirchen, G. Rigoll Gerhard-Mercator-University Duisburg The belief in TAP (LT, Poster) Y. Kabashima, D. Saad Tokyo Institute of Technology, Aston University Source separation as a by-product of regularization (AA, Poster) S. Hochreiter, J. Schmidhuber Technical University of Munich, IDSIA Optimizing admission control while ensuring quality of service in multimedia networks via reinforcement learning (CN, Poster) T. Brown, H. Tong, S. Singh University of Colorado SMEM algorithm for mixture models (AA, Poster) N. Ueda, R. Nakano, Z. Ghahramani, G. Hinton NTT Communication Science Laboratories, University of Toronto Neuronal regulation implements efficient synaptic pruning (NS, Poster) G. Chechik, I. Meilijson, E. 
Ruppin Tel-Aviv University Reinforcement learning based on on-line EM algorithm (CN, Poster) M. Sato, S. Ishii ATR Human Information Processing Research Laboratories, Nara Institute of Science and Technology Influence of changing the synaptic transmitter release probability on contrast adaptation of simple cells in the primary visual cortex (NS, Poster) P. Adorjan, K. Obermayer Technical University of Berlin Probabilistic visualization of high-dimensional binary data (AA, Poster) M. Tipping Microsoft Research Bayesian PCA (AA, Poster) C. Bishop Microsoft Research General-purpose localization of textured image regions (VS, Poster) R. Rosenholtz Xerox PARC Independent component analysis of intracellular calcium spike data (AP, Poster) K. Prank, J.Boerger, A. von zur Muehlen, G. Brabant, C. Schoefl Medical School Hannover Visualizing group structure (AA, Poster) M. Held, J. Puzicha, J. Buhmann University of Bonn Unsupervised clustering: the mutual information between parameters and observations (LT, Poster) D. Herschkowitz, J-P. Nadal Ecole Normale Superieure Using statistical properties of a labelled visual world to estimate scenes (VS, Poster) W. Freeman, E. Pasztor Mitsubishi Electric Research Laboratory Heeger's normalization, line attractor network and ideal observers (NS, Poster) S. Deneve, A. Pouget, P. Latham Georgetown University Learning Lie transformation groups for invariant visual perception (VS, Poster) R. Rao, D. Rudermann The Salk Institute Using collective intelligence to route Internet traffic (AP, Poster) D. Wolpert, K. Tumer, J. Frank NASA Ames Research Center Synergy and redundancy among brain cells of behaving monkeys (NS, Poster) I. Gat, N. Tishby Hebrew University, NEC Research Institute Where does the population vector of motor cortical cells point during reaching movements? (NS, Poster) P. Baraduc, E. Guigon, Y. Burnod Universite Pierre et Marie Curie Lazy learning meets the recursive least squares algorithm (AA, Poster) M. Birattari, G. 
Bontempi, H. Bersini Universite Libre de Bruxelles Variational approximations of graphical models using undirected graphs (LT, Poster) D. Barber, W. Wiegerinck University of Nijmegen Making templates rotationally invariant: an application to rotated digit recognition (AP, Poster) S. Baluja Carnegie Mellon University Probabilistic modeling for face orientation discrimination: learning from labeled and unlabeled data (AP, Poster) S. Baluja Carnegie Mellon University Maximum-likelihood continuity mapping (MALCOM): an alternative to HMMs (SP, Poster) D. Nix, J. Hogden Los Alamos National Laboratory Signal detection in noisy weakly-active dendrites (NS, Poster) A. Manwani, C. Koch California Institute of Technology An integrated vision sensor for the computation of optical flow singular points (IM, Poster) C. Higgins, C. Koch California Institute of Technology A micropower CMOS adaptive amplitude and shift invariant vector quantizer (IM, Poster) R. Coggins, R. Wang, M. Jabri University of Sydney Coding time-varying signals using sparse, shift-invariant representations (SP, Poster) M. Lewicki, T. Sejnowski The Salk Institute Unsupervised classification with non-Gaussian mixture models using ICA (AA, Poster) T-W. Lee, M. Lewicki, T. Sejnowski The Salk Institute Discovering hidden features with Gaussian processes regression (AA, Poster) F. Vivarelli, C. Williams Aston University Adding constrained discontinuities to Gaussian process models of wind fields (AP, Poster) D. Cornford, I. Nabney, C. Williams Aston University General bounds on Bayes errors for regression with Gaussian processes (LT, Poster) M. Opper, F. Vivarelli Aston University Efficient Bayesian parameter estimation in large discrete domains (AA, Poster) N. Friedman, Y. Singer UC Berkeley, AT&T Labs Support vector machines applied to face recognition (VS, Poster) J. Phillips National Institute of Standards and Technology A principle for unsupervised hierarchical decomposition of visual scenes (CS, Poster) M. 
Mozer University of Colorado Dynamically adapting kernels in support vector machines (LT, Poster) N. Cristianini, C. Campbell, J. Shawe-Taylor University of Bristol, University of London Graph matching for shape retrieval (AP, Poster) B. Huet, A. Cross, E. Hancock University of York Experiments with a memoryless algorithm which learns locally optimal stochastic policies for partially observable Markov decision processes J. Williams, S. Singh University of Colorado Basis selection for wavelet regression (AA, Poster) K. Wheeler NASA Ames Research Center Non-linear PI control inspired by biological control systems (CN, Poster) L. Brown, G. Gonye, J. Schwaber E.I. DuPont deNemours Utilizing time: asynchronous binding (CS, Poster) B. Love Northwestern University Discontinuous recall transitions induced by competition between short- and long-range interactions in recurrent networks (LT, Poster) N. Skantzos, C. Beckmann, A. Coolen King's College London On-line learning with restricted training sets: exact solution as benchmark for general theories (LT, Poster) H. Rae, P. Sollich, A. Coolen King's College London, University of Edinburgh Phase diagram and storage capacity of sequence storing neural networks (LT, Poster) A. Duering, A. Coolen, D. Sherrington Oxford University, King's College London Exploratory data analysis using radial basis function latent variable models (AA, Poster) A. Marrs, A. Webb DERA Spike-based compared to rate-based Hebbian learning (NS, Poster) R. Kempter, W. Gerstner, L. van Hemmen Technical University of Munich, Swiss Federal Institute of Technology Using analytic QP and sparseness to speed training of support vector machines (AA, Poster) J. Platt Microsoft Research Optimizing classifiers for imbalanced training sets (LT, Poster) G. Karakoulas, J. Shawe-Taylor Canadian Imperial Bank of Commerce, University of London Classification with linear threshold functions and the linear loss (LT, Poster) C. Gentile, M. 
Warmuth University of Milan, UC Santa Cruz Barycentric interpolators for continuous space & time reinforcement learning (CN, Poster) R. Munos, A. Moore Carnegie Mellon University Learning instance-independent value functions to enhance local search (CN, Poster) R. Moll, A. Barto, T. Perkins, R. Sutton University of Massachusetts Scheduling straight-line code using reinforcement learning and rollouts (AP, Poster) A. McGovern, E. Moss University of Massachusetts Global optimization of neural network models via sequential sampling (AA, Poster) J. de Freitas, M. Niranjan, A. Doucet, A. Gee Cambridge University Active noise canceling using analog neuro-chip with on-chip learning capability (IM, Poster) J-W. Cho, S-Y. Lee Korea Advanced Institute of Science and Technology A reinforcement learning algorithm in partially observable environments using short-term memory (CN, Poster) N. Suematsu, A. Hayashi Hiroshima City University Tight bounds for the VC-dimension of piecewise polynomial networks (LT, Poster) A. Sakurai Japan Advanced Institute of Science and Technology Applications of multi-resolution neural networks to mammography (AP, Poster) P. Sajda, C. Spence Sarnoff Corporation Restructuring sparse high dimensional data for effective retrieval (AA, Poster) C. Isbell, P. Viola Massachusetts Institute of Technology Vertex identification in high energy physics experiments (AP, Poster) G. Dror, H. Abramowicz, D. Horn The Academic College of Tel-Aviv-Yaffo, Tel-Aviv University A model for associative multiplication (CS, Poster) G. Christianson, S. Becker McMaster University Robot docking using mixtures of Gaussians (AP, Poster) M. Williamson, R. Murray-Smith, V. Hansen Massachusetts Institute of Technology, Technical University of Denmark, Daimler-Benz Computational differences between asymmetrical and symmetrical networks (LT, Poster) Z. Li, P. Dayan Massachusetts Institute of Technology A V1 model of pop out and asymmetry in visual search (VS, Poster) Z. 
Li Massachusetts Institute of Technology Modifying the parti-game algorithm for increased robustness, higher efficiency and better policies (CN, Poster) M. Al-Ansari, R. Williams Northeastern University Hierarchical ICA belief networks (AA, Poster) H. Attias UC San Francisco Exploring unknown environments with real-time heuristic search (CN, Poster) S. Koenig Georgia Institute of Technology Multiple paired forward-inverse models for human motor learning and control (CS, Poster) M. Haruno, D. Wolpert, M. Kawato ATR Human Information Processing Research Laboratories, University College London Learning to find pictures of people (VS, Poster) S. Ioffe, D. Forsyth UC Berkeley Facial memory is kernel density estimation (almost) (CS, Poster) M. Dailey, G. Cottrell, T. Busey UC San Diego, Indiana University Image statistics and cortical normalization models (NS, Poster) E. Simoncelli, O. Schwartz New York University Probabilistic sensor fusion (VS, Poster) R. Sharma, T. Leen, M. Pavel Oregon Graduate Institute Semi-supervised support vector machines (AA, Poster) K. Bennett, A. Demiriz Rensselaer Polytechnic Institute Optimizing correlation algorithms for hardware-based transient classification (IM, Poster) R. Edwards, G. Cauwenberghs, F. Pineda Johns Hopkins University On-line and batch parameter estimation of Gaussian mixtures based on the relative entropy (AA, Poster) Y. Singer, M. Warmuth AT&T Labs, UC Santa Cruz Very fast EM-based mixture model clustering using multiresolution kd-trees (AA, Poster) A. Moore Carnegie Mellon University Perceiving without learning: from spirals to inside/outside relations (CS, Poster) K. Chen, D. Wang Ohio State University A high performance k-NN classifier using a binary correlation matrix memory (IM, Poster) P. Zhou, J. Austin, J. Kennedy University of York Neural networks for density estimation (AA, Poster) M. Magdon-Ismail, A. 
Atiya California Institute of Technology Coordinate transformation learning of hand position feedback controller by using change of position error norm (CN, Poster) E. Oyama, S. Tachi University of Tokyo A randomized algorithm for pairwise clustering (AA, Poster) Y. Gdalyahu, D. Weinshall, M. Werman Hebrew University Familiarity discrimination of radar pulses (AP, Poster) E. Granger, S. Grossberg, M. Rubin, W. Streilein Ecole Polytechnique de Montreal, Boston University GLS: a hybrid classifier system based on POMDP research (CN, Poster) A. Hayashi, N. Suematsu Hiroshima City University Visualizing and analyzing single-trial event-related potentials (NS, Poster) T-P. Jung, S. Makeig, M. Westerfield, J. Townsend, E. Courchesne, T. Sejnowski The Salk Institute, Naval Health Research Center, UC San Diego Example based image synthesis of articulated figures (VS, Poster) T. Darrell Interval Research Maximum conditional likelihood via bound maximization and the CEM algorithm (AA, Poster) T. Jebara, A. Pentland Massachusetts Institute of Technology Boxlets: a fast convolution algorithm for signal processing and neural networks (AA, Poster) P. Simard, L. Bottou, P. Haffner, Y. LeCun AT&T Labs Learning multi-class dynamics A. Blake, B. North, M. Isard Oxford University Fisher scoring and a mixture of modes approach for approximate inference and learning in nonlinear state space models T. Briegel, V. Tresp Siemens AG Shrinking the tube: a new support vector regression algorithm (LT, Poster) B. Schoelkopf, P. Bartlett, A. Smola, R. Williamson GMD FIRST, Australian National University Regularizing AdaBoost (AA, Poster) G. Raetsch, T. Onoda, K. Mueller GMD FIRST A neuromorphic monaural sound localizer (IM, Poster) J. Harris, C-J. Pu, J. Principe University of Florida Convergence rates of algorithms for perceptual organization: detecting visual contours (AA, Poster) A. Yuille, J. 
Coughlan Smith-Kettlewell Eye Research Institute Minutemax: a fast approximation for minimax learning (VS, Poster) J. Coughlan, A. Yuille Smith-Kettlewell Eye Research Institute Call-based fraud detection in mobile communication networks using a hierarchical regime-switching model (AP, Poster) J. Hollmen, V. Tresp Helsinki University of Technology, Siemens AG The effect of eligibility traces on finding optimal memoryless policies in partially observable Markovian decision processes (CN, Poster) J. Loch University of Colorado Exploiting generative models in discriminative classifiers T. Jaakkola, D. Haussler UC Santa Cruz Complex cells as cortically amplified simple cells (NS, Poster) F. Chance, S. Nelson, L. Abbott Brandeis University Least absolute shrinkage is equivalent to quadratic penalization (AA, Poster) Y. Grandvalet, S. Canu Universite de Technologie de Compiegne Analog VLSI cellular implementation of the boundary contour system (IM, Poster) G. Cauwenberghs, J. Waskiewicz Johns Hopkins University Learning mixture hierarchies (AA, Poster) N. Vasconcelos, A. Lippman Massachusetts Institute of Technology Learning macro-actions in reinforcement learning (CN, Poster) J. Randlov Niels Bohr Institute From Dave_Touretzky at cs.cmu.edu Sun Oct 18 19:41:57 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Sun, 18 Oct 1998 19:41:57 -0400 Subject: CONNECTIONISTS policy on spam (unsolicited commercial email) Message-ID: <90.908754117@skinner.boltz.cs.cmu.edu> Recently I have seen a rash of conference announcements appearing in my personal mailbox, often from conferences I've never heard of before. There seems to be a trend, mainly in Europe but also sometimes in the US, to advertise academic conferences by spamming, as well as by the more socially acceptable means of Usenet newsgroup postings and messages to private mailing lists. 
The policy of the CONNECTIONISTS list is that we will not accept any announcements for conferences that engage in email spamming of personal mailboxes.

Let me be clear about what "spamming" means. There is nothing wrong with a conference organizer sending email announcements to persons who have an established relationship with that conference, i.e., attendees from prior years, authors of papers submitted to the conference, or anyone who has written the conference asking for information. However, if a conference organizer uses the email addresses of people who have no prior relationship to the conference, e.g., people who merely posted to a newsgroup like comp.ai.neural-nets, or mentioned neural networks on their personal home page, or are known in the field from their publications in other conferences, that is spamming. Those people have not given even implicit permission to place their names on a mailing list. Sending unsolicited mail to such people is antisocial behavior, even if the message contains instructions for how to get off the mailing list.

The Connectionists list operates with a "zero tolerance" policy toward spamming. One of the reasons the list is moderated is the need to filter spam. Neural network conference announcements are NOT spam when sent to this mailing list; they are entirely appropriate. But we will not cooperate with any conference that spams personal mailboxes.

For our European friends: the spam problem is currently far worse in the US than in Europe. Bills have been proposed in the US Congress to make spamming illegal, but none have passed yet. It is very important that we not create loopholes for "acceptable" spam, such as academic conference announcements, as such loopholes will undermine the legal argument that sending ANY email solicitation to private individuals without their permission should be banned.
(Due to freedom of speech considerations, if conference announcements are considered "okay", then porno spam and advertisements for financial scams also have to be allowed. And we don't want that.)

Anyone who wants to learn more about the spam problem and how to fight it is invited to visit the following web sites:

CAUCE -- Coalition Against Unsolicited Commercial Email
http://www.cauce.org and especially http://www.cauce.org/resources.html

Fight Spam on the Internet!
http://spam.abuse.net/

-- Dave Touretzky, Connectionists moderator

From Nigel.Goddard at ed.ac.uk Mon Oct 19 10:01:56 1998
From: Nigel.Goddard at ed.ac.uk (Nigel Goddard)
Date: Mon, 19 Oct 1998 15:01:56 +0100
Subject: Computational Functional MRI positions
Message-ID: <22599.199810191426@canna.dcs.ed.ac.uk>

Please forward to interested individuals

University of Edinburgh
Centre for Interactive Image Analysis
Research Fellows in Computational Functional MRI

This is an exciting new venture to visualise the working of the human brain during cognitive tasks using a new research MR scanner. The Centre for Interactive Image Analysis is run by the SHEFC Brain Imaging Research Centre for Scotland and the Institute for Adaptive and Neural Computation at the University of Edinburgh. Working closely together, the two postholders will jointly engage in the following aspects of the Centre's work:

* establish functional imaging on a research MRI scanner, including the stimulus presentation system, software and interfacing.
* work with cognitive scientists and clinicians to define, establish and run suitable test paradigms on healthy volunteers and patients.
* establish networking, image-processing, analysis, visualisation and archiving facilities for scientific and clinical data obtained in fMRI studies.
* work with cognitive scientists and clinicians to explore the uses of real-time and near-real-time analysis techniques in fMRI studies.
The postholders will be encouraged to undertake original research in one or more of the following areas of fMRI: novel pulse sequences; statistical models of physical and physiological noise; cognitive modelling; parallel statistical and image-processing algorithms; visualisation techniques. Candidates should hold a higher degree in an appropriate subject and should be familiar with PC and/or Unix computing. Expertise in cognitive modelling, MRI, statistics, image analysis, and visualisation is highly desirable. At least one postholder will have experience with parallel computing on Unix platforms. Salary scale: £15,735-£23,651 p.a. Both posts are tenable for 3 years initially, with renewal subject to success in attracting further funding. For further details see http://www.cns.ed.ac.uk/ciia Closing date: 6 November 1998. Interviews are expected 16-20 November 1998. Further particulars and application details should be obtained from the Personnel Department, The University of Edinburgh, 1 Roxburgh Street, Edinburgh, EH8 9TB, Tel. 0131 650 2511 (24 hour answering service), quoting reference number 896785. Informal enquiries may be made to: Dr Nigel Goddard, Inst. for Adaptive and Neural Computation, Tel: +44 131 650 3087, Fax: +44 131 650 6899, email: Nigel.Goddard at ed.ac.uk or to: Dr Ian Marshall, Dept. Medical Physics, Tel: +44 131 537 1661/2155, FAX: +44 131 537 1026, email: I.Marshall at ed.ac.uk

From at at coglit.soton.ac.uk Mon Oct 19 12:41:33 1998
From: at at coglit.soton.ac.uk (Adriaan Tijsseling)
Date: Mon, 19 Oct 1998 17:41:33 +0100
Subject: PhD Thesis now available as postscript (Conn. Models of Cat.)
Message-ID:

The following PhD thesis is available as postscript via
http://cogito.psy.soton.ac.uk/~at/CALM/Title.ps
http://cogito.psy.soton.ac.uk/~at/CALM/Abstract.ps
http://cogito.psy.soton.ac.uk/~at/CALM/PhD.ps
http://cogito.psy.soton.ac.uk/~at/CALM/Colors.ps

Adriaan Tijsseling

------------------------ Abstract ------------------------

Connectionist Models of Categorization: A Dynamical View of Cognition
by Adriaan Tijsseling

The functional role of altered similarity structure in categorization is analyzed. 'Categorical Perception' (CP) occurs when equal-sized physical differences in the signals arriving at our sensory receptors are perceived as smaller within categories and larger between categories (Harnad, 1987). Our hypothesis is that it is by modifying the similarity between internal representations that successful categorization is achieved. This effect depends in part on the iconicity of the inputs, which induces a similarity-preserving structure in the internal representations. Categorizations based on the similarity between stimuli are easier to learn than contra-iconic categorization; it is mainly to modify the latter in the service of categorization that the characteristic compression/separation of CP occurs. This hypothesis was tested in a series of neural net simulations of studies on category learning in human subjects. The nets are first pre-exposed to the inputs and then given feedback on their performance. The behavior of the resulting networks was then analyzed and compared to human performance. Before it is given feedback, the network discriminates and categorizes input based on the inherent similarity of the input structure. With corrective feedback the net moves its internal representations away from category boundaries. The effect is that similarity of patterns that belong to different categories is decreased, while similarity of patterns from the same category is increased (CP).
Neural net simulations make it possible to look inside a hypothetical black box of how categorization may be accomplished; it is shown how increased attention to one or more dimensions in the input and the salience of input features affect category learning. Moreover, the observed 'warping' of similarity space in the service of categorization can provide useful functionality by creating compact, bounded chunks (Miller, 1956) with category names that can then be combined into higher-order categories described by the symbol strings of natural language and the language of thought (Greco, Cangelosi, & Harnad, 1997). The dynamic models of categorization of the kind analyzed here can be extended to make them powerful models of neuro-symbolic processing (Casey, 1997) and a fruitful territory for future research.

From harnad at coglit.soton.ac.uk Tue Oct 6 11:42:32 1998
From: harnad at coglit.soton.ac.uk (Stevan Harnad)
Date: Tue, 6 Oct 1998 16:42:32 +0100 (BST)
Subject: Representation: Psycoloquy Call for Commentators
Message-ID:

Markman/Dietrich: Representation/Mediation

The target article whose abstract appears below has just appeared in PSYCOLOQUY, a refereed journal of Open Peer Commentary sponsored by the American Psychological Association. Qualified professional biobehavioral, neural or cognitive scientists are hereby invited to submit Open Peer Commentary on it. Please email for Instructions if you are not familiar with format or acceptance criteria for PSYCOLOQUY commentaries (all submissions are refereed). To submit articles and commentaries or to seek information: EMAIL: psyc at pucc.princeton.edu URL: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc

AUTHOR'S RATIONALE FOR SOLICITING COMMENTARY: There has been a growing sentiment in cognitive science, particularly among advocates of dynamical systems and situated action, that traditional approaches to representation should be abandoned.
Calls for the elimination of representation, as well as previous attempts to defend representation, have unfortunately talked past each other for a lack of common ground. This target article attempts to provide a common ground for the debate over representation. Many proponents of representations are searching for a single representational system to serve as the basis of all cognitive models; this paper argues that multiple approaches to representation must coexist in cognitive models. We also hope to elicit discussion about what properties of representations are critical for cognitive models. Full text of article available at: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.48 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.48.representation-mediation.1.markman ----------------------------------------------------------------------- psycoloquy.98.9.48.representation-mediation.1.markman Mon Oct 5 1998 ISSN 1055-0143 (68 paragraphs, 78 references, 1297 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 1998 Arthur B. Markman & Eric Dietrich IN DEFENSE OF REPRESENTATION AS MEDIATION Arthur B. Markman Department of Psychology University of Texas Austin, TX 78712 markman at psy.utexas.edu http://www.psy.utexas.edu/psy/FACULTY/Markman/index.html Eric Dietrich PACCS Program in Philosophy Binghamton University Binghamton, NY dietrich at binghamton.edu http://www.binghamton.edu/philosophy/home/faculty/index.htm ABSTRACT: Some cognitive scientists have asserted that cognitive processing is not well modeled by classical notions of representation and process that have dominated psychology and artificial intelligence since the cognitive revolution. In response to this claim, the concept of a mediating state is developed. Mediating states are the class of information-carrying internal states used by cognitive systems, and as such are accepted even by those researchers who reject representations. 
The debate over representation, then, is actually one about what additional properties of mediating states are necessary for explaining cognitive processing. Five properties that can be added to mediating states are examined for their importance in cognitive models. KEYWORDS: compositionality, computation, connectionism, discrete states, dynamic systems, explanation, information, meaning, mediating states, representation, rules, semantic content, symbols. Full text of article available at: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.48 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.48.representation-mediation.1.markman To submit articles and commentaries or to seek information: EMAIL: psyc at pucc.princeton.edu URL: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc

From sue at soc.plym.ac.uk Tue Oct 20 10:15:00 1998
From: sue at soc.plym.ac.uk (Sue Denham)
Date: Tue, 20 Oct 1998 15:15:00 +0100
Subject: Lecturership Post Available
Message-ID: <1.5.4.32.19981020141500.0073f550@soc.plym.ac.uk>

University of Plymouth
School of Computing
Lecturer/Senior Lecturer in Computational Intelligence in Finance and Business

Applications are invited for the above position. Candidates must have either: i) a PhD degree in the area of computational intelligence (neural computing or evolutionary computing) and an interest in, together with some experience of, the application of these computing technologies in finance, investment and business, or ii) a higher degree in a financial/business discipline and at least five years' experience of applying computational intelligence techniques in the financial/business sector. A record of research publication in this field is highly desirable.
The person appointed will be expected: i) to lead the development of teaching and research in this field, working within the School's Centre for Neural and Adaptive Systems, and ii) to take a leading role in the design and promotion of a new MSc programme in Computational Intelligence in Finance and Business, which the School is planning to start within the next two years. Further details of the position are available from: Dr Sue Denham, Centre for Neural and Adaptive Systems, School of Computing, University of Plymouth, Plymouth PL4 8AA, England, tel: +44 17 52 23 26 10, fax: +44 17 52 23 25 40, e-mail: sue at soc.plym.ac.uk http://www.tech.plym.ac.uk/soc/research/neural/index.html

From debodt at fin.ucl.ac.be Wed Oct 21 10:07:16 1998
From: debodt at fin.ucl.ac.be (de Bodt Eric)
Date: Wed, 21 Oct 1998 16:07:16 +0200
Subject: ACSEG98 - Call for participation
Message-ID: <043f01bdfcfc$1bc55fc0$84996882@pcdebodt.ucl.ac.be>

CONNECTIONIST APPROACHES IN ECONOMICS AND MANAGEMENT SCIENCES
FIFTH INTERNATIONAL MEETING
COMPLEX DATA: MODELING AND ANALYSIS
CALL FOR PARTICIPATION

Since the beginning of the 80s, important advances have been made in developing diverse new approaches of bio-mimetic inspiration (neural nets, genetic algorithms, cellular automata, etc.). These approaches are of prime interest for researchers both in Economics and in Management Sciences. The ACSEG International Meetings give the opportunity to assess the state-of-the-art in the domain, to delineate future developments, and to evidence the contribution of bio-mimetic methods to Economics and Management Sciences. They also allow the researchers to present their recent work, to exchange know-how, and to discuss the problems encountered in their research. The 1998 ACSEG International Meeting on Complex Data Modeling and Analysis will take place at the Universite catholique de Louvain, November 20, 1998.
Organizers are the research centers SAMOS (Universite de Paris 1 - Pantheon - Sorbonne), CEGF (Universite catholique de Louvain) and CeReFim (Facultes Universitaires Notre-Dame de la Paix). If interested, check the conference page http://mkb.fin.ucl.ac.be/Acseg98 (the list of accepted papers is presented as well as the program of the day) or write to: ACSEG98, Centre d'Etudes en Gestion Financiere, Institut d'Administration et de Gestion, Universite catholique de Louvain, 1 place des Doyens, 1348 Louvain-la-Neuve - Belgium (Fax. : + (32).10.47.83.24) for additional information.

From d.mareschal at psychology.bbk.ac.uk Wed Oct 21 04:58:42 1998
From: d.mareschal at psychology.bbk.ac.uk (Denis Mareschal)
Date: Wed, 21 Oct 1998 09:58:42 +0100
Subject: connectionist models of learning and development
Message-ID:

Dear all, Members of this list may be interested in the following new journal. There has recently been an increase in the amount of computational and especially connectionist modelling of learning and development in infancy and childhood. Many of the researchers in the field want to find a natural outlet for the publication of their work. Developmental Science is a journal that specifically solicits manuscripts reporting on computational and connectionist models of learning and development. Also of interest are empirical studies that test existing models. Apologies to those who may receive multiple copies of this message. Best Regards, Denis

-------------------- INSERT TEXT --------------------------

The first issue of Developmental Science, the journal of the European Society for Developmental Psychology, has now been published.
Developmental Science represents the best of contemporary scientific developmental psychology both in the presentation of theory and in reporting new data. Developmental Science will include:
* Comparative and biological perspectives
* Connectionist and computational perspectives
* Dynamical systems theory

VOLUME 1, ISSUE 1 CONTENTS:

Peer commentary article
* Uniquely Primate, Uniquely Human - Michael Tomasello

Peer commentaries on Tomasello: "Uniquely Primate, Uniquely Human"
* Comment on Michael Tomasello's Uniquely Primate, Uniquely Human - Merlin Donald
* The Shaping of Social Cognition in Evolution and Development - Andrew Whiten
* Comment on "Uniquely Primate, Uniquely Human" - Marc D Hauser
* Uniquely to what ends? - Jonas Langer
* Simian Similarities, Schisms, and "Special Social Skills" - James R Anderson

Reply by the author
* Response to Commentators - Michael Tomasello

Reports
* A nonhuman primate's expectations about object motion and destination: the importance of self-propelled movement and animacy - Marc D Hauser
* Development of Precision Grips in Chimpanzees - George Butterworth & Shoji Itakura
* Development of Selective Attention in Young Infants: Enhancement and Attenuation of Startle Reflex by Attention - John E Richards
* Visual attention in infants with perinatal brain damage: Evidence of the importance of anterior lesions - Mark H Johnson, Leslie A Tucker, Joan Stiles & Doris Trauner
* Gravity Does Rule for Falling Events - Bruce M Hood
* Imitation across Changes in Object Affordances and Social Context in 9-Month-Old Infants - Emmanuel Devouche
* Object Individuation in Young Infants: Further Evidence with an Event-Monitoring Paradigm - Teresa Wilcox & Renee Baillargeon
* Computational Evidence for the Foundations of Numerical Competence - Tony J Simon
* Newborns Learn to Identify a Face in Eight/Tenths of a Second? - Gail Walton, Erika Armstrong & Thomas G R Bower

Papers
* Global Influences on the Development of Spatial and Object Perceptual Categorization Abilities: Evidence from Preterm Infants - Clay Mash, Paul C Quinn, Velma Dobson & Dana B Narter
* A Computational Analysis of Conservation - Thomas R Shultz
* We almost had a great future behind us: the contribution of non-linear dynamics to developmental-science-in-the-making - Paul van Geert

VOLUME 1, ISSUE 2 CONTENTS:

Article with peer commentary and a reply by the author
* Infant perseveration and implications for object permanence theories: A PDP model of the AB task - Y. Munakata

Commentaries
* Understanding the A not B error: Working memory vs reinforced response, or active trace vs latent trace - A. Diamond
* Toward a general model of perseveration in infancy - R. Baillargeon & A. Aguiar
* Commentary on Munakata - J. G. Bremner
* The development of delayed response: parallel distributed processing lacks neural plausibility - S. Dehaene
* On theory and modeling - J. Mandler
* To reach or not to reach: that is the question - Denis Mareschal
* Commentary on Munakata's theory of object permanence development - J. S. Reznick
* Infant perseveration and implications for object permanence theories: A PDP model of the A not B task - J. Russell
* Babies have bodies: Why Munakata's net fails to meet its own goals - L. B. Smith

Reply by the author
* Infant perseveration: rethinking data, theory and the role of modeling - Y. Munakata

Reports
* Special Section: Work from the Medical Research Council Cognitive Development Unit, London - John Morton, Uta Frith, Mark Johnson & Annette Karmiloff-Smith
* Is Dutch native English? Linguistic analysis by 2 month olds - A. Christophe & J. Morton
* Object centred attention in 8 month olds - M. Johnson & R. O.
Gilmore

If you would like further information regarding Developmental Science, including notes for contributors and editorial information, please log on to: Alternatively contact Rachel Manns at Blackwell Publishers by email on rmanns at blackwellpublishers.co.uk or by fax on +44 (0) 1865 381362

=================================================
Dr. Denis Mareschal
Centre for Brain and Cognitive Development
Department of Psychology
Birkbeck College
University of London
Malet St., London WC1E 7HX, UK
tel +44 171 631-6582/6207
fax +44 171 631-6312
=================================================

From girosi at massa-intermedia.ai.mit.edu Thu Oct 22 12:34:21 1998
From: girosi at massa-intermedia.ai.mit.edu (Federico Girosi)
Date: Thu, 22 Oct 98 12:34:21 EDT
Subject: NEW BOOK by Partha Niyogi
Message-ID: <9810221634.AA04234@massa-intermedia.mit.edu>

NEW BOOK *** NEW BOOK *** NEW BOOK *** NEW BOOK *** NEW BOOK ***

People might be interested in the following book on the relationship between learning, neural networks and generative grammar.

Federico Girosi

-----------------------------------------------------------------------

The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar
Partha Niyogi (MIT and Bell Laboratories)
[Kluwer Academic Publishers: ISBN 0792380819]

Among other topics, this book brings together two important but very different learning problems within the same analytical framework. The first is the problem of learning functional mappings using Neural Networks; the second is learning natural language grammars in the Principles and Parameters tradition of Chomsky. The two learning problems are seemingly very different. Neural networks are real-valued, infinite-dimensional, continuous mappings. Grammars are boolean-valued, finite-dimensional, discrete (symbolic) mappings. Furthermore the research communities that work in the two areas almost never overlap. The objective of this book is to bridge this gap.
It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question -- how much information does it take to learn -- of both problems, it highlights their similarities and differences. It shows how "setting parameters" in the principles and parameters tradition of linguistic theory and "learning the connections" in neural networks are conceptually very similar problems and both have reasonable statistical formulations. At the same time, the results from learning theory are used to argue that both processes must be highly constrained for learning to happen. Specific results include model selection in neural networks, active learning, language learning and evolutionary models of language change. "The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar" is a very interdisciplinary work. Anyone interested in the interaction of computer science and cognitive science should enjoy the book. Researchers in artificial intelligence, neural networks, linguistics, theoretical computer science and statistics will find it particularly relevant.

From Annette_Burton at Brown.edu Thu Oct 22 10:15:18 1998
From: Annette_Burton at Brown.edu (Annette Burton)
Date: Thu, 22 Oct 1998 10:15:18 -0400
Subject: Job Announcement
Message-ID:

Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce A NEW INTERDISCIPLINARY POSTDOCTORAL OPPORTUNITY in LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES As part of an NSF award to Brown University through the IGERT program, the Departments of Cognitive and Linguistic Sciences, Computer Science, and Applied Mathematics will be hiring two Postdoctoral Research Associates.
Fellows will be scholars who have displayed significant interest and ability in conducting collaborative interdisciplinary research in one or more of the research areas of the program: computational and empirical approaches to uncertainty in language, vision, action, or human reasoning. As well as participating in collaborative research, responsibilities will include helping to coordinate cross-departmental graduate teaching and research as well as some teaching of interdisciplinary graduate courses. We expect that these fellows will play an important role in creating a highly visible presence for the IGERT program at Brown, and their interdisciplinary activities will help unify the interdepartmental activities of the IGERT program. Applicants must hold a PhD in Cognitive Science, Linguistics, Computer Science, Mathematics, Applied Mathematics, or a related discipline, or show evidence that the PhD will be completed before the start of the position. Applicants should send a vita and three letters of reference to Steven Sloman, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, RI 02912. Special consideration will be given to those applicants whose research is relevant to at least two of the participating departments. The positions will begin September 1, 1999 for one year, renewable upon satisfactory completion of duties in the first year. Salaries will be between $35,000 and $45,000 per year. All materials must be received by Jan. 1, 1999, for full consideration. Brown University is an Equal Opportunity/Affirmative Action Employer. 
For additional information about the program and ongoing research initiatives, please visit our website at: http://www.cog.brown.edu/IGERT

From kehagias at egnatia.ee.auth.gr Fri Oct 23 00:24:47 1998
From: kehagias at egnatia.ee.auth.gr (Thanasis Kehagias)
Date: Thu, 22 Oct 1998 21:24:47 -0700
Subject: Data Allocation
Message-ID: <199810221822.VAA25531@egnatia.ee.auth.gr>

INTRODUCTION: With respect to my query regarding the DATA ALLOCATION problem, I got several interesting replies and biblio pointers. However, I feel that my original question has NOT been answered yet. I want to keep this message relatively brief. I first present a summary of the original question, then my own results up to date, then the responses by other people and some of my subsequent thoughts. At the end of this message you can find a short BIBLIOGRAPHY of related references. A more complete presentation and a larger BIBTEX file can be found at http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm

=========================================================================

DATA ALLOCATION PROBLEM: The full description was given in a previous message on this list and can also be found at http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm . In short the problem is this: a time series is generated by alternate activation of two sources. The time series is observed, but the source process is not. It is required to find the source process. MY QUESTION: Suppose SOME online filter is used to separate the data into two classes, using incoming data to retrain the filter (and thus sharpen the separation criterion). What can be said about the CONVERGENCE of this filter to correct data allocation (alias separation or segmentation) using only VERY GENERAL assumptions about the filtering algorithm and the nature of the sources?
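To make the setting concrete, here is a toy simulation of the problem just described (illustrative only; the AR(1) sources, block lengths, initial coefficients and the LMS retraining rule are arbitrary choices of mine, not part of the general question): two hidden sources alternate in blocks, and an online winner-take-all filter assigns each sample to whichever of two linear predictors fits it better, retraining only the winner.

```python
import random

def make_series(n, block=50, seed=0):
    """Alternate two hidden AR(1) sources (coefficients +0.9 / -0.9) in blocks."""
    rng = random.Random(seed)
    y, prev, src = [], 0.0, 1
    for t in range(n):
        if t % block == 0:
            src = 1 - src                      # switch the active (unobserved) source
        a = 0.9 if src == 0 else -0.9
        prev = a * prev + rng.gauss(0.0, 0.1)  # AR(1) step with small noise
        y.append(prev)
    return y

def allocate(y, lr=0.05):
    """Online winner-take-all data allocation with two linear predictors.

    Each incoming sample joins the data pool of the predictor with the
    smaller one-step prediction error; only that predictor is retrained
    (a plain LMS update). Returns the learned coefficients and the two
    pool sizes, i.e. the M(t) and N(t) counts at the final time.
    """
    a = [0.5, -0.5]                            # initial predictor coefficients
    pool = [0, 0]
    for t in range(1, len(y)):
        err = [y[t] - a[k] * y[t - 1] for k in (0, 1)]
        k = 0 if abs(err[0]) < abs(err[1]) else 1
        pool[k] += 1                           # sample is allocated to filter k
        a[k] += lr * err[k] * y[t - 1]         # retrain only the winning filter
    return a, pool

if __name__ == "__main__":
    a, pool = allocate(make_series(2000))
    print("coefficients:", a, "pool sizes:", pool)
```

With two well-separated sources and predictors initialized on either side, each predictor captures one source and drifts toward its coefficient; whether and when such schemes provably converge, under assumptions that do not mention the particular retraining rule, is exactly the question above.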
MAIN RESULT: Here is the main result I have been so far able to obtain, stated VERY INFORMALLY: "THEOREM": Say that at time t the filter has accepted into its data pool M(t) samples from source1 and N(t) samples from source2. IF: the COMBINATION of: source1, source2, the data allocation criterion and the filter retraining algorithm satisfy certain SEPARABILITY conditions, THEN: as t goes to infinity, the ratio M(t)/N(t) goes either to zero or to one, with probability one. REMARK: This means that asymptotically, the filter will be trained on a data pool which contains predominantly either source1 data or source2 data. A bunch of (real) theorems along the above lines are stated and proved in Part III of "PREDICTIVE MODULAR NEURAL NETWORKS" (http://skiron.control.ee.auth.gr/~kehagias/thn/thn02b01.htm). Note that the conditions used in the above "Theorem" (as well as the true theorems) are quite general: the separability conditions do not refer to a particular algorithm. It follows that D. Ringach's remark (see below) is right to the point: some kind of SEPARABILITY assumption is necessary. SOME REMARKS BY OTHER PEOPLE: Some people responded by citing papers which use specific algorithms to solve the above problem. For instance, J. Kohlmorgen cited his own work, R. Rohwer mentioned the mixture model problem, A. Storkey cited his own work and also papers by (a) Z. Ghahramani and G. Hinton, (b) A. Weigend et al. S. Waterhouse mentioned a range of possibilities (mixture models, mixtures of experts, HMMs, HM Trees) and proposed some possible algorithmic approaches; he also referred me to his WEB page (http://www.ultimode.com/stevew/) for related references. Finally, D. Ringach made a very significant observation, which I reproduce here:

>I don't see how you can do anything unless you assume something about
>the sources... Otherwise, how could you rule out the null hypothesis
>that is a single source with the measured distribution of y(i)?
MY RESPONSE: Most of the above people proposed specific ALGORITHMS to solve the problem. However, my original question was about: the data allocation CONVERGENCE problem, in an UNSUPERVISED ONLINE setting and at a GENERAL level (not related to specific algorithms). Most of the pointers I got contained no convergence analysis. At any rate, I am familiar with a number of similar algorithms for which ALGORITHM SPECIFIC convergence analyses ARE available (e.g. for the LVQ algorithm). I have attempted to give a GENERAL analysis in the PAPERS and BOOK referred to in my previous messages and in my home page (http://skiron.control.ee.auth.gr/~kehagias/thn/thn.htm). Also I present some relevant thoughts in (http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm).

BIBLIOGRAPHY: Here are a FEW references about the data allocation problem (alias segmentation or data separation problem). You can find a more complete BIBTEX file at http://skiron.control.ee.auth.gr/~kehagias/thn/thn0301.bib . These references are mostly about algorithms to effect data allocation, either in an offline or online fashion; usually (but not always) the question of convergence is not treated theoretically. I should stress that the list below, as well as the BibTeX file at my home site, are by no means an exhaustive bibliographic coverage. They are just lists of some papers I have read and enjoyed.

----------------------------------------------------------------------------

1. Ghahramani, Z. and Hinton, G.E.: Switching State-Space Models. (1998). Submitted for Publication.
2. Jordan, M.I. and R. A. Jacobs. ``Hierarchical mixtures of experts and the EM algorithm.'' Neural Computation, 6, 181-214, 1994.
3. Jordan, M.I. and L. Xu. ``Convergence results for the EM approach to mixtures of experts architectures.'' Neural Networks, 8, 1409-1431.
4. Kohlmorgen, J. and Müller, K.-R. and Pawelzik, K.
(1998), Analysis of Drifting Dynamics with Neural Network Hidden Markov Models, in NIPS '97: Advances in Neural Information Processing Systems 10, MIT Press, to appear in 1998.
5. Levin, E. "Hidden control neural architecture modeling of nonlinear time varying systems and its applications". IEEE Trans. on Neural Networks 4 (1993) pp. 109-116.
6. Pawelzik, K. and Kohlmorgen, J. and Müller, K.-R. (1996), Annealed Competition of Experts for a Segmentation and Classification of Switching Dynamics, Neural Computation 8, pp. 340-356.
7. Storkey, A.J. Gaussian Processes for switching regimes. ICANN98.
8. Waterhouse, S. and Tony Robinson. "Classification Using Hierarchical Mixtures of Experts". Presented at IEEE Conference on Neural Networks and Signal Processing, 1994.
9. Waterhouse, Steve and Tony Robinson. "Constructive Methods for Mixtures of Experts" (UK Version). Presented at NIPS: Neural Information Processing Systems 1995.
10. Weigend, A.S. and M. Mangeas and A. N. Srivastava (1995) Nonlinear gated experts for time series: discovering regimes and avoiding overfitting.
11. Xu, L. and M. I. Jordan. ``On convergence properties of the EM Algorithm for Gaussian mixtures.'' Neural Computation, 8, 129-151, 1996.

___________________________________________________________________
Ath. Kehagias
--Assistant Prof. of Mathematics, American College of Thessaloniki
--Research Ass., Dept. of Electrical and Computer Eng.
Aristotle Univ., Thessaloniki, GR54006, GREECE --email: kehagias at egnatia.ee.auth.gr, kehagias at ac.anatolia.edu.gr --web: http://skiron.control.ee.auth.gr/~kehagias/index.htm From ASJagath at ntu.edu.sg Fri Oct 23 03:49:48 1998 From: ASJagath at ntu.edu.sg (Jagath C Rajapakse (Dr)) Date: Fri, 23 Oct 1998 15:49:48 +0800 Subject: graduate student opportunities Message-ID: <6665AC0C667ED11186E308002BB487E10366CF41@exchange2> > School of Applied Science > Nanyang Technological University > Singapore > > Graduate Research Opportunities in Neuroimaging and Neurocomputing > > These are exciting opportunities for graduate students to conduct research in > neuroimaging and neurocomputing, available through a > collaborative research project between the Singapore Gamma Knife Center, > Singapore General Hospital, and the School of Applied Science, Nanyang > Technological University. > > The functional MRI scans of the human brain during cognitive tasks, obtained > at the MRI facility of the Singapore General Hospital, will be analyzed > using neural network paradigms and statistical techniques to detect human > brain activation. Students will join a research group analyzing functional > MR images and time-series to interpret human brain activities and infer > neuronal events in cognitive experiments. The research positions are > initially available for a Masters degree, and may be continued > towards a Ph.D. degree depending on performance in the first year. > > Research students will receive a monthly stipend above S$1400 depending on > qualifications. > The positions will be available during the year 1999. > > Prospective students should contact Dr. Jagath Rajapakse for further > details and applications: > > Dr. Jagath C.
Rajapakse > School of Applied Science > Nanyang Technological University > Nanyang Avenue, Singapore 639798 > Email: asjagath at ntu.edu.sg > Phone: +65 790 5802 > > > From jose at tractatus.rutgers.edu Fri Oct 23 10:56:26 1998 From: jose at tractatus.rutgers.edu (Stephen Jose Hanson) Date: Fri, 23 Oct 1998 10:56:26 -0400 Subject: RUTGERS-NEWARK JUNIOR POSITION in COG SCI/COG NEURO Message-ID: <3630991A.4B8F432@tractatus.rutgers.edu> The Department of Psychology of Rutgers University-Newark Campus anticipates making ONE tenure-track appointment in Cognitive Science or Cognitive Neuroscience at the Assistant Professor level. The Psychology Department has made three related appointments in the last two years and is expanding rapidly in this area. Candidates should have an active research program in memory, learning, categorization, attention, action, high-level vision, or language. Particular interest will exist in candidates who combine one or more of the research interests above with FUNCTIONAL IMAGING (e.g. fMRI, PET, or ERP). Psychology has an ongoing affiliation/collaboration with the UMDNJ fMRI laboratory. Review of applications will begin on January 8, 1999, but applications will continue to be accepted until the position is filled. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send a CV, three letters of recommendation, and two reprints to Professor S. J. Hanson, Chair, Department of Psychology - Cognitive Science Search, Rutgers University, Newark, NJ 07102.
Email enquiries can be made to cogsci at psychology.rutgers.edu. Also see http://www.psych.rutgers.edu From cindy at cns.bu.edu Thu Oct 22 15:45:21 1998 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Thu, 22 Oct 1998 15:45:21 -0400 Subject: May 1999 Conference Message-ID: <199810221945.PAA01781@retina.bu.edu> ***** CALL FOR PAPERS ***** THIRD INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 26-29, 1999 Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from DARPA and ONR How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? The conference will include invited tutorials and lectures, and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. Abstract submission encourages presentation of the latest results. Costs are kept at a minimum without compromising the quality of meeting handouts and social events. CONFIRMED INVITED SPEAKERS INCLUDE: Andreas Andreou Randolph Blake Rodney Brooks Gail Carpenter Dario Floreano Joaquin Fuster Paolo Gaudiano Charles Gilbert Larry Gillick Steven Greenberg Stephen Grossberg Michael Hasselmo Joseph LeDoux John Lisman Ennio Mingolla Tomaso Poggio Daniel Schacter Shihab Shamma Richard Shiffrin Nobuo Suga David van Essen Steven Zucker There will be contributed oral and poster sessions on each day of the conference.
CALL FOR ABSTRACTS Session Topics: * vision * spatial mapping and navigation * object recognition * neural circuit models * image understanding * neural system models * audition * mathematics of neural systems * speech and language * robotics * unsupervised learning * hybrid systems (fuzzy, evolutionary, digital) * supervised learning * neuromorphic VLSI * reinforcement and emotion * industrial applications * sensory-motor control * cognition, planning, and attention * other Contributed Abstracts must be received, in English, by January 29, 1999. Notification of acceptance will be given by February 28, 1999. A meeting registration fee of $45 for regular attendees and $30 for students must accompany each Abstract. See Registration Information for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted abstracts will be returned on request only until April 15, 1999. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), and mailing and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; and a first and second choice from among the topics above, including whether it is biological (B) or technological (T) work. Example: first choice: vision (T); second choice: neural system models (B). (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned.
Accepted Abstracts will be printed in the conference proceedings volume. No full-length paper will be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to cover meeting travel and living costs. The deadline to apply for fellowship support is January 29, 1999. Applicants will be notified by February 28, 1999. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Reimbursement checks will be distributed after the meeting.
REGISTRATION FORM Third International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 26, 1999 Meeting: May 27-29, 1999 FAX: (617) 353-7755 (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. CHECK ONE: ( ) $70 Conference plus Tutorial (Regular) ( ) $45 Conference plus Tutorial (Student) ( ) $45 Conference Only (Regular) ( ) $30 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). 
Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ From cns-cas at cns.bu.edu Thu Oct 22 11:17:46 1998 From: cns-cas at cns.bu.edu (Boston University - Cognitive and Neural Systems) Date: Thu, 22 Oct 1998 11:17:46 -0400 Subject: Faculty Opening CNS-BU Message-ID: <199810221517.LAA18379@cochlea.bu.edu> NEW FACULTY IN COGNITIVE AND NEURAL SYSTEMS AT BOSTON UNIVERSITY Boston University seeks an assistant or associate professor for its graduate Department of Cognitive and Neural Systems. The department offers an integrated curriculum of psychological, neurobiological, and computational concepts, models, and methods in the fields of computational neuroscience, connectionist cognitive science, and neural network technology in which Boston University is a leader. Candidates should have an outstanding research profile, preferably including extensive analytic or computational research experience in modeling a broad range of nonlinear neural networks, especially in one or more of the areas: vision and image processing, adaptive pattern recognition, cognitive information processing, speech and language, adaptive-sensory motor control, and neural network technology. Send a complete curriculum vitae and three letters of recommendation to Search Committee, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston University, Boston, MA 02215. Boston University is an Equal Opportunity/Affirmative Action employer. (please post) From lambri at ifi.unizh.ch Wed Oct 28 10:02:07 1998 From: lambri at ifi.unizh.ch (Dimitrios Lambrinos) Date: Wed, 28 Oct 1998 15:02:07 +0000 Subject: Ph.D. 
position at AILab, Zurich Message-ID: <363731EF.D0588F50@ifi.unizh.ch> ------------------------------------------ Position for a Ph.D. student in Biorobotics at the AILab, University of Zurich ------------------------------------------ A new Ph.D. student position is open at the Artificial Intelligence Laboratory, Dept. of Computer Science of the University of Zurich. Availability: Immediately or at earliest convenience. Continuing previous work conducted at the AILab, this research will focus on Biorobotics, i.e. on building autonomous agents, based on biological findings, that are capable of robust visually guided behavior in complex real-world environments. Biorobotics, with its goal of bringing together biology/neurobiology, engineering, and computer science, has enormous potential. On the one hand it will change our perception of biological intelligence; on the other it will inspire and influence the way we design, build, and use information technology. The main task of the agents will be to navigate safely in complex environments. This will require scaling up previous theoretical and practical work. The challenge will be, first, to demonstrate that mechanisms which are thought to be employed by natural systems can be implemented in real-world artifacts and, second, that such mechanisms, though very parsimonious, are sufficient for achieving complex behavior. If the above challenges capture your interest and you would like to become a member of an international research team conducting transdisciplinary work, submit a curriculum vitae, statement of research interests, and names of three referees ASAP to: Corinne Maurer Dept.
of Computer Science University of Zurich Winterthurerstrasse 190 CH - 8057 Zurich, Switzerland E-mail: maurer at ifi.unizh.ch Phone: 41 - 1 - 635 43 31 Fax: 41 - 1 - 635 68 09 Profile: Applicants should have an MSc degree, or equivalent, in one of the following areas: computer science, electrical or mechanical engineering, biology, neurobiology, physics, mathematics, cognitive science, or related disciplines. He/she should have good programming skills (C, C++, etc.), preferably experience with robot programming, knowledge of electronics, and a strong scientific background. Tasks: The main task for the accepted candidate will be to conduct research towards his/her Ph.D. Additional tasks include support for classes organized by the AI-Lab as well as other administrative tasks required by the computer science department. Financial: The salary will be according to the specifications of the Swiss National Science Foundation. Time prospect: The candidate is expected to complete his/her Ph.D. work within a maximum period of 4 years. From smyth at sifnos.ics.uci.edu Wed Oct 28 17:41:05 1998 From: smyth at sifnos.ics.uci.edu (Padhraic Smyth) Date: Wed, 28 Oct 1998 14:41:05 -0800 Subject: Tenure track faculty position at UC Irvine Message-ID: <9810281441.aa26794@paris.ics.uci.edu> Dear Connectionists, FYI: the tenure-track position advertised below encompasses research topics closely related to neural computation, such as computational statistics and computational biology. I would be grateful if you would pass this information along to any of your colleagues or students who may be interested. I am happy to answer any general questions about the department and UCI if you wish to contact me by email. Padhraic Smyth Associate Professor Information and Computer Science University of California, Irvine.
smyth at ics.uci.edu Open Faculty Position in Information and Computer Science at UC Irvine http://www.ics.uci.edu/interfac.html The Department of Information and Computer Science (ICS) has a tenure-track position open in the general area of interdisciplinary applications of computing. Research emphases include areas such as computational statistics, scientific data visualization, computer graphics and animation, computational biology, medical informatics, information organization, storage, retrieval and visualization. The available position is at the assistant professor level, but exceptional candidates from all ranks will be considered. In all cases, we are looking for applicants with a Ph.D. degree in Computer Science or a related field, and strong research credentials as evidenced by scholarly publications. Applicants for senior positions must also demonstrate a proven track record in original research and teaching activities. The ICS Department is organized as an independent campus unit reporting to the Executive Vice Chancellor. It runs the second most popular major at UCI and has designed an undergraduate honors program that attracts the campus' most qualified students. ICS faculty are at the forefront of research in emerging areas of the computer science discipline, such as multimedia/embedded computing, knowledge-discovery in databases, bioinformatics, and the role of information in computer science and society. The faculty has effective interdisciplinary ties to colleagues in biology, cognitive science, engineering, management, medicine, and the social sciences. The Department currently has 32 full-time faculty and 125 Ph.D. students involved in various research areas including computer science theory, artificial intelligence, networks and distributed systems, databases, multimedia systems, computer systems design, software/software engineering, human-computer interaction and computer-supported cooperative work.
ICS at UC Irvine represents one of the fastest growing departments and a computer science program that builds upon our strengths in core as well as growth areas of computer science. Although UCI is a young university, it has attained remarkable stature in the past 3 decades. Two Nobel prizes were recently awarded to UCI faculty. UCI is located three miles from the Pacific Ocean near Newport Beach, approximately forty miles south of Los Angeles. Irvine is consistently ranked among the safest cities in the U.S. and has an exceptional public school system. The campus is surrounded by high-technology companies that participate in an active affiliates program. Both the campus and the area offer exciting professional and cultural opportunities. Mortgage and housing assistance are available including newly built, for-sale housing located on campus and within short walking distance from the department. Applicants should send a cover letter indicating that they are applying for the "interdisciplinary applications of computing faculty position, Position E", a CV, sample papers and contact information for five references to: ICS Faculty Position E c/o Joy Schuler, Department of Information and Computer Science University of California, Irvine Irvine, CA 92697-3425. Application screening will begin immediately upon receipt of curriculum vitae. Maximum consideration will be given to applications received by December 1, 1998. Salaries are commensurate with experience. The University of California is an Equal Opportunity Employer, committed to excellence through diversity.
From MA_S435 at kingston.ac.uk Thu Oct 8 08:17:14 1998 From: MA_S435 at kingston.ac.uk (Dimitris Tsaptsinos) Date: Thu, 8 Oct 1998 13:17:14 +0100 Subject: CFP - EANN99 Message-ID: Fifth International Conference on Engineering Applications of Neural Networks (EANN '99) Warsaw, Poland 13-15 September 1999 First Call for Papers The conference is a forum for presenting the latest results on neural network applications in technical fields. The applications may be in any engineering or technical field, including but not limited to systems engineering, mechanical engineering, robotics, process engineering, metallurgy, pulp and paper technology, aeronautical engineering, computer science, machine vision, chemistry, chemical engineering, physics, electrical engineering, electronics, civil engineering, geophysical sciences, biomedical systems, and environmental engineering. Summaries of two pages (about 1000 words) should be sent by e-mail. Various sessions are organised, and from this year the responsibility for some of the sessions has been delegated to session coordinators. Two-page abstracts related to the following areas may be sent directly to the person(s) coordinating the session. Control systems (E. Tulunay, etulunay at ed.eee.metu.edu.tr and A. Ruano, aruano at ualg.pt) Vision/Image processing (S. Draghici, sod at cs.wayne.edu) Hybrid systems (J. Fernandez de Canete, canete at ctima.uma.es and D. Tsaptsinos, D.Tsaptsinos at kingston.ac.uk) Process Engineering (R. Baratti, baratti at unica.it) Biomedical systems (W. Duch, duch at phys.uni.torun.pl) Metallurgy (P. Myllykoski, pirkka.myllykoski at hut.fi) Meteorology (C. Schizas, schizas at ucy.ac.cy) Papers which do not fit any of the above categories can be sent to the common address of the conference, which is eann99 at phys.uni.torun.pl.
Deadline: 15 February 1999 Format: in plain text format (for other formats please email eann99 at the previous address) Please mention two to four keywords. Submissions will be reviewed. For information on earlier EANN conferences see the www pages at http://www.abo.fi/~abulsari/EANN98.html. The local organisers are putting together a web page at http://www.phys.uni.torun.pl/eann99/ Notification of acceptance will be sent around 15 March. The papers will be expected by 15 April. All papers will be up to 6 pages in length. Authors are expected to register by 15 April. Organising committee R. Baratti, University of Cagliari, Italy L. Bobrowski, Polish Academy of Science, Poland A. Bulsari, Nonlinear Solutions Oy, Finland W. Duch, Nicholas Copernicus University, Poland J. Fernandez de Canete, University of Malaga, Spain A. Ruano, University of Algarve, Portugal D. Tsaptsinos, Kingston University, UK National program committee L. Bobrowski, IBIB Warsaw, A. Cichocki, RIKEN, Japan W. Duch, Nicholas Copernicus University T. Kaczorek, Warsaw Polytechnic J. Korbicz, Technical University of Zielona Gora L. Rutkowski, Czestochowa Polytechnic R. Tadeusiewicz, Academy of Mining and Metallurgy, Krakow Z. Waszczyszyn, Krakow Polytechnic International program committee (to be confirmed, extended) S. Cho, Pohang University of Science and Technology, Korea T. Clarkson, King's College, UK S. Draghici, Wayne State University, USA G. Forsgrén, Stora Corporate Research, Sweden I. Grabec, University of Ljubljana, Slovenia A. Iwata, Nagoya Institute of Technology, Japan C. Kuroda, Tokyo Institute of Technology, Japan H. Liljenström, Royal Institute of Technology, Sweden L. Ludwig, University of Tübingen, Germany M. Mohammadian, Monash University, Australia P. Myllykoski, Helsinki University of Technology, Finland A. Owens, DuPont, USA R. Parenti, Ansaldo Ricerche, Italy F. Sandoval, University of Malaga, Spain C. Schizas, University of Cyprus, Cyprus E.
Tulunay, Middle East Technical University, Turkey S. Usui, Toyohashi University of Technology, Japan P. Zufiria, Polytechnic University of Madrid, Spain Electronic mail is not absolutely reliable, so if you have not heard from the conference secretariat after sending a summary, please contact us again. You should receive an abstract number within a couple of days of submission. Dr Dimitris Tsaptsinos Senior Lecturer School of Mathematics Kingston University Penrhyn Road Kingston-Upon-Thames Surrey KT1 2EE From martym at cs.utexas.edu Fri Oct 30 20:14:39 1998 From: martym at cs.utexas.edu (martym@cs.utexas.edu) Date: Fri, 30 Oct 1998 19:14:39 -0600 Subject: connectionist NLP software, papers, web demos Message-ID: <199810310114.TAA17159@fez.cs.utexas.edu> Dear Connectionists: The following software package for connectionist natural language processing, and papers based on it, are available from the UTCS Neural Networks Research Group website http://www.cs.utexas.edu/users/nn. Live demos of the software, including the systems described in the papers, can be run remotely from the research description page http://www.cs.utexas.edu/users/nn/pages/research/nlp.html. ======================================================================= Software: MIR: RAPID PROTOTYPING OF NEURAL NETWORKS FOR SENTENCE PROCESSING http://www.cs.utexas.edu/users/nn/pages/software/abstracts.html#mir The MIR software package has been designed for rapid prototyping of typical architectures used in NLP research (such as SRN, RAAM, and SOM) that depend heavily on one or more lexicons, and it can be easily extended to handle other architectures as well. It has been designed to be simple to install, modify, and use, while retaining as much flexibility as possible. To this end, the package is written in C using the TCL/TK libraries. The package includes a number of basic commands and widgets, so that the user can quickly set up, train, and test various architectures and visualize their dynamics.
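As an illustration of the simplest of the architectures mentioned above, here is a minimal Elman-style simple recurrent network (SRN) forward pass. This is not MIR's own API (MIR is written in C with TCL/TK); the sketch is in Python, and all function names and layer sizes here are invented for the example.

```python
import math
import random

# Minimal Elman-style simple recurrent network (SRN) forward pass.
# Illustration only: all names and sizes are invented for this sketch.

def make_srn(n_in, n_hid, n_out, seed=0):
    """Build random weight matrices for input->hidden, context->hidden,
    and hidden->output connections."""
    rng = random.Random(seed)
    w = lambda rows, cols: [[rng.uniform(-0.5, 0.5) for _ in range(cols)]
                            for _ in range(rows)]
    return {"W_in": w(n_hid, n_in),    # input -> hidden
            "W_ctx": w(n_hid, n_hid),  # context (previous hidden) -> hidden
            "W_out": w(n_out, n_hid),  # hidden -> output
            "n_hid": n_hid}

def _sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def srn_step(net, x, context):
    """One time step: combine the current input with the saved context layer."""
    hidden = [_sigmoid(sum(wi * xi for wi, xi in zip(net["W_in"][j], x)) +
                       sum(wc * ci for wc, ci in zip(net["W_ctx"][j], context)))
              for j in range(net["n_hid"])]
    output = [_sigmoid(sum(wo * hj for wo, hj in zip(row, hidden)))
              for row in net["W_out"]]
    return output, hidden   # the new hidden state becomes the next context

def srn_run(net, sequence):
    """Feed a whole sequence, carrying the context layer across steps."""
    context = [0.0] * net["n_hid"]
    outputs = []
    for x in sequence:
        y, context = srn_step(net, x, context)
        outputs.append(y)
    return outputs
```

At each step the previous hidden layer is fed back as a "context" input, which is what lets an SRN carry information across a sentence: the same input word produces different outputs at different positions in the sequence.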
High-level scripts with training/testing data are also provided as examples of running complete experiments with MIR. ======================================================================= Papers: ======================================================================= SARDSRN: A NEURAL NETWORK SHIFT-REDUCE PARSER Marshall R. Mayberry, III, and Risto Miikkulainen. Technical Report AI98-275, Department of Computer Sciences, The University of Texas at Austin, 1998 (12 pages). http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#mayberry.sardsrn.ps.Z Simple Recurrent Networks (SRNs) have been widely used in natural language tasks. SARDSRN extends the SRN by explicitly representing the input sequence in a SARDNET self-organizing map. The distributed SRN component leads to good generalization and robust cognitive properties, whereas the SARDNET map provides exact representations of the sentence constituents. This combination allows SARDSRN to learn to parse sentences with more complicated structure than can the SRN alone, and suggests that the approach could scale up to realistic natural language. ======================================================================= DISAMBIGUATION AND GRAMMAR AS EMERGENT SOFT CONSTRAINTS Risto Miikkulainen and Marshall R. Mayberry, III. In B. J. MacWhinney (editor), Emergentist Approaches to Language. Hillsdale, NJ: Erlbaum, in press (16 pages). http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#miikkulainen.emergent.ps.Z When reading a sentence such as "The diplomat threw the ball in the ballpark for the princess" our interpretation changes from a dance event to baseball and back to dance. Such on-line disambiguation happens automatically and appears to be based on dynamically combining the strengths of association between the keywords and the two word senses. Subsymbolic neural networks are very good at modeling such behavior. 
They learn word meanings as soft constraints on interpretation, and dynamically combine these constraints to form the most likely interpretation. On the other hand, it is very difficult to show how systematic language structures such as relative clauses could be processed in such a system. The network would only learn to associate them to specific contexts and would not be able to process new combinations of them. A closer look at understanding embedded clauses shows that humans are not very systematic in processing grammatical structures either. For example, "The girl who the boy who the girl who lived next door blamed hit cried" is very difficult to understand, whereas "The car that the man who the dog that had rabies bit drives is in the garage" is not. This difference emerges from the same semantic constraints that are at work in the disambiguation task. In this chapter we will show how the subsymbolic parser can be combined with high-level control that allows the system to process novel combinations of relative clauses systematically, while still being sensitive to the semantic constraints.
From kehagias at egnatia.ee.auth.gr Thu Oct 1 14:10:05 1998 From: kehagias at egnatia.ee.auth.gr (Thanasis Kehagias) Date: Thu, 01 Oct 1998 11:10:05 -0700 Subject: THE DATA ALLOCATION PROBLEM Message-ID: <3.0.5.32.19981001111005.007a08e0@egnatia.ee.auth.gr> THE DATA ALLOCATION PROBLEM I recently became interested in the following problem and would be grateful for any feedback you can give me (mailto:kehagias at egnatia.ee.auth.gr). The Setup: Consider a collection of data: y(1), y(2), y(3), ..., generated by more than one source. At time t one of the sources is activated (perhaps randomly) and generates the datum y(t). We want to identify the number of active sources and extract some information regarding each source (e.g. an input/output model, or statistics such as the mean, standard deviation, etc. of the source's output). No a priori information is available regarding the number, behavior, etc. of the sources. In particular, the observed data are unlabelled, i.e. it is not known which source is active at time t. The Online Data Allocation Task: It seems to me that in such a situation the major task is data allocation.
I mean this: if the observed data were partitioned into groups, each group containing data generated by a single source, then each data group could be used to train a model for the respective source. Generally speaking, training on clean data groups should not be too hard. However, since the data are not labelled, it is not immediately clear how to allocate them among groups. As I will explain a little later, the problem seems harder for the online case (with a continuously incoming stream of data) than for the offline case, where a finite data set is involved. The Convergence Question: Special cases of the above problem and various solutions have appeared in the literature. I am interested in obtaining quite general sufficient (and perhaps necessary) conditions for an online data allocation process to converge to a correct solution. By "correct solution", I mean a partition of the observed data into groups such that every group contains predominantly data from one source and every source corresponds to only one data group. The convergence conditions should be fairly general, so as to allow a unified treatment of many different data allocation algorithms and different kinds of sources (and data). We have obtained some results, which appear in our recent book (announced in a separate posting) and in a series of papers (also announced in a separate posting). I summarize our results on my web site at http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm At this point I am interested in getting some feedback regarding possible approaches to the problem, relevant bibliographic pointers and so on. I already have a modestly sized bibliography on this. I will summarize all responses and post the summary. ___________________________________________________________________ Ath. Kehagias --Assistant Prof. of Mathematics, American College of Thessaloniki --Research Ass., Dept. of Electrical and Computer Eng.
Aristotle Univ., Thessaloniki, GR54006, GREECE --email: kehagias at egnatia.ee.auth.gr, kehagias at ac.anatolia.edu.gr --web: http://skiron.control.ee.auth.gr/~kehagias/index.htm From Frederic.Alexandre at loria.fr Thu Oct 1 05:01:19 1998 From: Frederic.Alexandre at loria.fr (Frederic Alexandre) Date: Thu, 1 Oct 1998 11:01:19 +0200 (MET DST) Subject: Postdoctoral positions Message-ID: <199810010901.LAA13409@wernicke.loria.fr> LORIA/INRIA Lorraine computer science laboratory in Nancy, France Two postdoctoral positions are available for developing computational neuroscience models in Nancy, France, from January to July 1999. Our team: the CORTEX team is developing biologically inspired connectionist models for perception, reasoning and autonomous behavior. Based in a computer science lab, our main goal is to propose effective models for robotics, speech and image processing. The two positions: they will fit into our two current projects: first, a model of hippocampus-cortex connections for the internal representation of the environment; second, a model of the prefrontal cortex for the intelligent exploitation of this representation (strategy, planning). The candidates: they should be well trained in connectionist modeling, primarily from a computer science point of view. Experience in robotics, autonomous behavior and neuroscience would be highly appreciated.
For further information, contact:
----------------------------------------------------------------------------
Frederic ALEXANDRE, INRIA-Lorraine/LORIA-CNRS, BP 239, 54506 Vandoeuvre-les-Nancy Cedex, FRANCE
Tel: (+33/0) 3 83 59 20 53 Fax: (+33/0) 3 83 41 30 79
E-mail: falex at loria.fr http://www.loria.fr/~falex
----------------------------------------------------------------------------
From simon.schultz at psy.ox.ac.uk Thu Oct 1 05:56:51 1998 From: simon.schultz at psy.ox.ac.uk (Simon Schultz) Date: Thu, 01 Oct 1998 10:56:51 +0100 Subject: Thesis available Message-ID: <361351E3.2F1C@psy.ox.ac.uk> The following D.Phil. thesis is now available for downloading: ----------------------------------------------------------------- Information encoding in the mammalian cerebral cortex Simon R. Schultz Corpus Christi College, Oxford Short abstract: This thesis describes new techniques for studying information encoding and transmission in the mammalian nervous system. The underlying theme of the thesis is the use of information theory to quantitatively study real and model neuronal systems. The thesis begins with an analytical calculation of the information that can be conveyed by a feedforward network of threshold-linear neurons. The replica-symmetric solution for the mutual information is found to be valid at all but low noise values. This method is then used to make a quantitative calculation of the information that can be conveyed by the Schaffer collaterals, which project from hippocampal subregion CA3 to subregion CA1. The effects on information transmission of a number of details of the anatomy of the projection are explored, including convergence, divergence and topography of connectivity. Information theory is then applied to the analysis of data from neurophysiological recordings, by quantifying the information encoded in the responses (action potentials) of neural ensembles about environmental correlates.
The decoding approach to estimating the information contained in the responses of populations of cells is examined in the limit of short time windows. It is shown that in this physiologically pertinent limit, decoding algorithms which estimate the full probability distribution must fail, whereas maximum likelihood algorithms remain accurate. The metric content, or amount of structure in the neuronal activity, is found to have a residual component at short time windows which is related to the instantaneous information transmission rate. The equation for mutual information is then approximated by a series expansion to second order, and it is found that while, as has been previously noted, the first-order terms depend only on the firing rates, the second-order terms break down into rate and correlational components of the information. This leads to a new procedure for quantifying the relative contributions of correlations (such as synchronisation) and firing rates to neural information encoding. The practicality of this procedure is demonstrated by applying it to data recorded from the primate medial and inferior temporal lobes.
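[Editorial illustration, not code from the thesis.] The quantities in the abstract above all build on the standard mutual information formula. As a minimal sketch of the base quantity being approximated (the function name and toy counts are assumptions; a naive plug-in estimator like this one would need the limited-sampling bias corrections that real spike-train analyses require), I(S;R) in bits can be estimated from a stimulus-by-response count table:

```python
import numpy as np

def mutual_information(joint_counts):
    """Plug-in estimate of I(S;R) in bits from a stimulus-by-response count table.

    Rows index stimuli, columns index (binned) responses. Illustrative only:
    no correction for limited-sampling bias.
    """
    p = joint_counts / joint_counts.sum()       # joint probabilities P(s, r)
    ps = p.sum(axis=1, keepdims=True)           # marginal P(s), shape (S, 1)
    pr = p.sum(axis=0, keepdims=True)           # marginal P(r), shape (1, R)
    mask = p > 0                                # skip zero cells (0 * log 0 = 0)
    return float(np.sum(p[mask] * np.log2(p[mask] / (ps @ pr)[mask])))

# Two stimuli, two response bins; perfectly informative responses give 1 bit.
counts = np.array([[50, 0],
                   [0, 50]])
print(mutual_information(counts))  # -> 1.0
```

The rate/correlation decomposition described in the abstract comes from expanding this quantity in the length of the time window rather than evaluating it directly; the sketch only shows the quantity being expanded.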
----------------------------------------------------------------- The thesis is available either as a gzipped postscript file: http://www.mrc-bbc.ox.ac.uk/~schultz/thesis.ps.gz or as individual chapters: http://www.mrc-bbc.ox.ac.uk/~schultz/theschaps.html --
-----------------------------------------------------------------------
Simon Schultz
Department of Experimental Psychology, University of Oxford, South Parks Rd., Oxford OX1 3UD
also: Corpus Christi College, Oxford OX1 4JF
Phone: +44-1865-271419 Fax: +44-1865-310447
http://www.mrc-bbc.ox.ac.uk/~schultz/
-----------------------------------------------------------------------
From oby at cs.tu-berlin.de Fri Oct 2 09:55:19 1998 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Fri, 2 Oct 1998 15:55:19 +0200 (MET DST) Subject: preprints available Message-ID: <199810021355.PAA02999@pollux.cs.tu-berlin.de> Dear connectionists, attached please find abstracts and preprint locations of five manuscripts on:
optical recording of brain activity:
1. tissue optics simulations for depth-resolved optical recording
2. ICA analysis of optical recording data
biological modelling:
3. contrast adaptation, fast synaptic depression, and Infomax in visual cortical neurons
4. the role of non-linear lateral interactions in cortical map formation
ANN theory:
5. optimal hyperplane classifiers for pseudo-Euclidean and pairwise data
Comments are welcome! Cheers Klaus
-----------------------------------------------------------------------------
Prof. Dr. Klaus Obermayer
FR2-1, NI, Informatik, Technische Universitaet Berlin, Franklinstrasse 28/29, 10587 Berlin, Germany
phone: 49-30-314-73442 / 49-30-314-73120 fax: 49-30-314-73121
e-mail: oby at cs.tu-berlin.de http://ni.cs.tu-berlin.de/
=============================================================================
Simulation of Scanning Laser Techniques for Optical Imaging of Blood-Related Intrinsic Signals M. Stetter and K.
Obermayer Fachbereich Informatik, Technische Universitaet Berlin Optical Imaging of intrinsic signals detects neural activation patterns by taking video images of the local activity-related changes in the light intensity reflected from neural tissue (intrinsic signals). At red light (605 nm), these signals are mainly caused by local variations of the tissue absorption following deoxygenation of blood. In this work, we characterize the image generation process during Optical Imaging by Monte Carlo simulations of light propagation through a homogeneous model tissue equipped with a local absorber. Conventional video imaging and Scanning Laser imaging are compared to each other. We find that, compared to video imaging, Scanning Laser techniques drastically increase both the contrast and the lateral resolution of optical recordings. Also, the maximum depth up to which the signals can be detected is increased by roughly a factor of 2 using Scanning Laser Optical Imaging. Further, the radial profile of the diffuse reflectance pattern for each pixel is subject to changes which correlate with the depth of the absorber within the tissue. We suggest a detection geometry for the online measurement of these radial profiles, which can be realized by modifying a standard Scanning Laser Ophthalmoscope. in: Journal of the Optical Society of America A, in press available at: http://ni.cs.tu-berlin.de/publications/#journals ----------------------------------------------------------------------------- Blind separation of spatial signal patterns from optical imaging records. I. Schiessl^1, M. Stetter^1, J. Mayhew^2, S. Askew^2, N. McLoughlin^3, J. Levitt^4, J. Lund^4, and K. Obermayer^5.
1 Fachbereich Informatik, Technische Universitaet Berlin 2 AIVRU, University of Sheffield 3 Department of Neurobiology, Harvard Medical School 4 Institute of Ophthalmology, UCL Optical imaging of intrinsic signals measures two-dimensional neuronal activity patterns by detecting small activity-related changes in the light reflectance of neural tissue. We test to what extent blind source separation methods, which are based on the spatial independence of different signal components, are suitable for the separation of these neural-activity related signal components from nonspecific background variations of the light reflectance. Two ICA algorithms (Infomax and kurtosis optimization) and blind source separation by extended spatial decorrelation are compared to each other with respect to their robustness against sensor noise, and are applied to optical recordings from macaque primary visual cortex. We find that extended spatial decorrelation is superior to both the ICA algorithms and standard methods, because it explicitly takes advantage of the spatial smoothness of the intrinsic signal components. in: Proceedings of the ICA '99 conference, 1999 (accepted) available at: http://ni.cs.tu-berlin.de/publications/#conference ----------------------------------------------------------------------------- Influence of changing the synaptic transmitter release probability on contrast adaptation of simple cells in the primary visual cortex. P. Adorjan and K. Obermayer. Fachbereich Informatik, Technische Universitaet Berlin The contrast response function (CRF) of many neurons in the primary visual cortex saturates, and shifts towards higher contrast values following prolonged presentation of high contrast visual stimuli. Using a recurrent neural network of excitatory spiking neurons with adapting synapses, we show that both effects could be explained by a fast and a slow component in the synaptic adaptation.
The fast component - a short-term synaptic depression component - leads to a saturation of the CRF and a phase advance in the cortical cells' response to high contrast stimuli. The slow component is derived from an adaptation of the probability of synaptic transmitter release, and it changes such that the mutual information between the input and the output of a cortical neuron is maximal. This component - given by the infomax learning rule - explains contrast adaptation of the averaged membrane potential (DC component) as well as the surprising experimental result that the stimulus-modulated component (F1 component) of a cortical cell's membrane potential adapts only weakly. Based on our results, we propose a new experimental method to estimate the strength of the effective excitatory feedback to a cortical neuron, and we also suggest a relatively simple experimental test to justify our hypothesized synaptic mechanism for contrast adaptation. in: Advances in Neural Information Processing Systems NIPS 11, 1999 (accepted). available at: http://ni.cs.tu-berlin.de/publications/#conference ----------------------------------------------------------------------------- The role of lateral cortical competition in ocular dominance development. C. Piepenbrock and K. Obermayer. Fachbereich Informatik, Technische Universitaet Berlin Lateral competition within a layer of neurons sharpens and localizes the response to an input stimulus. Here, we investigate a model for the activity-dependent development of ocular dominance maps in which the degree of lateral competition can be varied. For weak competition, it resembles a correlation-based learning model and for strong competition, it becomes a self-organizing map. Thus, in the regime of weak competition the receptive fields are shaped by the second-order statistics of the input patterns, whereas in the regime of strong competition, the higher moments and ``features'' of the individual patterns become important.
When correlated localized stimuli from two eyes drive the cortical development, we find (i) that a topographic map and binocular, localized receptive fields emerge when the degree of competition exceeds a critical value and (ii) that receptive fields exhibit eye dominance beyond a second critical value. For anti-correlated activity between the eyes, the second-order statistics drive the system to develop ocular dominance even for weak competition, but no topography emerges. Topography is established only beyond a critical degree of competition. in: Advances in Neural Information Processing Systems NIPS 11, 1999 (accepted). available at: http://ni.cs.tu-berlin.de/publications/#conference ----------------------------------------------------------------------------- Classification on pairwise proximity data. T. Graepel, R. Herbrich, P. Bollmann-Sdorra, and K. Obermayer. Fachbereich Informatik, Technische Universitaet Berlin We investigate the problem of learning a classification task on data represented in terms of their pairwise proximities. This representation does not refer to an explicit feature representation of the data items and is thus more general than the standard approach of using Euclidean feature vectors, from which pairwise proximities can always be calculated. Our first approach is based on a combined linear embedding and classification procedure resulting in an extension of the Optimal Hyperplane algorithm to pseudo-Euclidean data. As an alternative, we present another approach based on a linear threshold model in the proximity values themselves, which is optimized using Structural Risk Minimization. We show that prior knowledge about the problem can be incorporated by the choice of distance measures, and we examine different metrics with respect to their generalization performance. Finally, the algorithms are successfully applied to protein structure data and to data from the cat's cerebral cortex. They show better performance than K-nearest-neighbor classification.
in: Advances in Neural Information Processing Systems NIPS 11, 1999 (accepted). available at: http://ni.cs.tu-berlin.de/publications/#conference From harnad at coglit.soton.ac.uk Fri Oct 2 12:33:34 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Fri, 2 Oct 1998 17:33:34 +0100 (BST) Subject: Social Cognitive Bias: PSYCOLOQUY Call for Commentary Message-ID: Krueger: Social Cognitive Bias The target article whose abstract appears below has just appeared in PSYCOLOQUY, a refereed journal of Open Peer Commentary sponsored by the American Psychological Association. Qualified professional biobehavioral, neural or cognitive scientists are hereby invited to submit Open Peer Commentary on it. Please email for Instructions if you are not familiar with format or acceptance criteria for PSYCOLOQUY commentaries (all submissions are refereed). To submit articles and commentaries or to seek information: EMAIL: psyc at pucc.princeton.edu URL: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc To retrieve the article: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.46 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.46.social-bias.1.krueger AUTHOR'S RATIONALE FOR SOLICITING COMMENTARY: My contention is that social psychological research has depicted social perception in an excessively negative light by relying too much on demonstrations of various irrational biases. Normative models of good judgment have been too restrictive, and the prevalent testing strategy has equated good judgment with the truth of a null hypothesis. Rejections of such null hypotheses have then been interpreted as evidence for bias. I am particularly interested in learning how psychologists and methodologists respond to the idea that the use of multiple theories and methods will improve our understanding of social perception.
I realize that my proposal is incomplete because breaking the predominance of the single ruling inference strategy (Null Hypothesis Significance Testing) may make it harder to draw comparisons between studies. How can the field preserve its coherence, while abandoning its traditional ways? ----------------------------------------------------------------------- psycoloquy.98.9.46.social-bias.1.krueger Fri Oct 2 1998 ISSN 1055-0143 (21 paragraphs, 41 references, 4 notes, 647 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 1998 Joachim Krueger THE BET ON BIAS: A FOREGONE CONCLUSION? Joachim Krueger Department of Psychology Brown University, Box 1853 Providence, RI 02912 USA Joachim_Krueger at Brown.edu http://www.brown.edu/Departments/Psychology/faculty/krueger.html ABSTRACT: Social psychology has painted a picture of human misbehavior and irrational thinking. For example, prominent social cognitive biases are said to distort consensus estimation, self perception, and causal attribution. The thesis of this target article is that the roots of this negativistic paradigm lie in the joint application of narrow normative theories and statistical testing methods designed to reject those theories. Suggestions for balancing the prevalent paradigm include (a) modifications to the ruling rituals of Null Hypothesis Significance Testing, (b) revisions of what is considered a normative response, and (c) increased emphasis on individual differences in judgment. 
KEYWORDS: Bayes' rule, bias, hypothesis testing, individual differences, probability, rationality, significance testing, social cognition, statistical inference To retrieve the article: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.46 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.46.social-bias.1.krueger From sporns at nsi.edu Fri Oct 2 15:37:36 1998 From: sporns at nsi.edu (Olaf Sporns) Date: Fri, 02 Oct 1998 19:37:36 +0000 Subject: job applications invited Message-ID: <36152B7F.6158FBDB@nsi.edu> Job applications are invited: POSTDOCTORAL FELLOWS W.M. KECK MACHINE PSYCHOLOGY LABORATORY The Neurosciences Institute, located in San Diego, California, invites applications for POSTDOCTORAL FELLOWSHIPS to study biologically based models of behaving real-world devices (robots) as part of the newly established W.M. Keck Machine Psychology Laboratory. Continuing previous research conducted at the Institute, this Laboratory will focus on the construction of autonomous robots, the design of simulated models of large-scale neuronal networks that are capable of guiding behavior in the real world, and on developing methods for the simultaneous analysis of neural and behavioral states. Applicants should have a background in computational neuroscience, robotics, computer science, behavioral or cognitive science. Fellows will receive stipends appropriate to their qualifications and experience. Submit a curriculum vitae, statement of research interests, and names of three referees to: Dr. Olaf Sporns The Neurosciences Institute 10640 John Jay Hopkins Drive San Diego, California 92121 Fax: 619-626-2099 email: sporns at nsi.edu URL: http://www.nsi.edu URL: http://www.nsi.edu/users/sporns/ Key reference: Almassy, N., G.M. Edelman, O. Sporns (1998) Behavioral constraints in the development of neuronal properties: A cortical model embedded in a real-world device. Cerebral Cortex 8:346-361.
From ingber at ingber.com Sat Oct 3 09:37:05 1998 From: ingber at ingber.com (Lester Ingber) Date: Sat, 3 Oct 1998 08:37:05 -0500 Subject: Open R&D Positions in Computational Finance/Chicago Message-ID: <19981003083705.A7988@ingber.com> * Open R&D Positions in Computational Finance/Chicago DRW Investments, LLC, a proprietary trading firm based at the Chicago Mercantile Exchange, with a branch office in London, is expanding its research department to support trading. * Programmer/Analyst -- Full Time At least 1-2 years experience programming in C or C++. Must have excellent background in Math, Physics, or similar disciplines. Flexible hours in intense environment. Requires strong commitment to several ongoing projects with shifting priorities. See http://www.ingber.com/ for some papers and code used on current projects. Please email Lester Ingber with a resume or any questions regarding these positions. * Graduate Student(s) -- Part Time (1 or 2) We would like to sponsor theses in computational finance that might impact our trading practices. Would require at least weekly face-to-face contact with DRW personnel. See http://www.ingber.com/ for some papers on current projects. Please email Lester Ingber with a resume or any questions regarding these positions. * Systems Administrator -- Full Time The primary role is to oversee and manage a heterogeneous network. Essential skills required are Windows NT, any flavor of Unix, TCP/IP networking and some programming skills in any language. 1-2 years experience required. Please email Man Wei Tam with a resume or any questions regarding the position. -- /* Lester Ingber Lester Ingber Research * * PO Box 06440 Wacker Dr PO Sears Tower Chicago, IL 60606-0440 * * http://www.ingber.com/ ingber at ingber.com ingber at alumni.caltech.edu */ From bernabe at cnmx4-fddi0.imse.cnm.es Mon Oct 5 04:46:54 1998 From: bernabe at cnmx4-fddi0.imse.cnm.es (Bernabe Linares B.) 
Date: Mon, 5 Oct 1998 10:46:54 +0200 Subject: No subject Message-ID: <199810050846.KAA11836@cnm12.cnm.es> A non-text attachment was scrubbed... Name: not available Type: text Size: 529 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/970595a7/attachment-0001.ksh From becker at curie.psychology.mcmaster.ca Mon Oct 5 21:38:24 1998 From: becker at curie.psychology.mcmaster.ca (Sue Becker) Date: Mon, 5 Oct 1998 21:38:24 -0400 (EDT) Subject: faculty positions Message-ID: Dear connectionists, There are a number of open faculty positions at McMaster University which may interest you or your colleagues. Please feel free to pass this message on or post to other sites. Position in Psychology: Appended to the bottom of this message is an advertisement for a faculty position in the Department of Psychology at McMaster University, targeting Cognitive Psychology or Animal Behaviour. Computational modellers whose research intersects with the study of brain and behaviour are encouraged to apply. Please drop me a note if you plan on applying. Positions in Computer Science and Software Engineering: The Department of Computing and Software at McMaster also has a number of open positions. Several areas are being targeted but all outstanding applicants will be considered. See http://www.cas.mcmaster.ca This department has recently moved into the Faculty of Engineering and is expanding rapidly. Positions in Electrical Engineering: The Department of Electrical and Computer Engineering is also expanding rapidly, with information technology having been designated a priority area at McMaster. This department is presently targeting the areas of signal processing, computer networking and VLSI. For further details, see http://ece.mcmaster.ca/job_ft2.htm McMaster University is the first, and to our knowledge, the only university to offer an Honours Degree in Neural Computation. 
It is by many standards Canada's most research-intensive university - financial support from governments, foundations and business for our research projects is, on a per capita basis, the highest in the country. We have been recognized for several years as Canada's most innovative university. McMaster's Psychology Department is considered by many to be the best in Canada - for example, it ranks highest in the country in terms of publications per faculty member. Sincerely, Sue Becker, Associate Professor Dept. of Psychology -------------------------------------------------------------------------- FACULTY POSITION IN COGNITIVE PSYCHOLOGY/ANIMAL BEHAVIOUR The Department of Psychology at McMaster University invites applications from candidates eligible to be sponsored for a Natural Sciences and Engineering Research Council University Faculty Award (UFA), an award that is directed toward women. We plan to sponsor an application for a UFA, and will offer the successful applicant a tenure track position at the level of Assistant Professor. The duration of the UFA, 3 years initially with a possibility of renewal for a further 2 years, will count toward the normal probationary period for tenure. We are seeking candidates with an active research program in either animal behaviour or cognitive psychology. Preference for the cognition opening is for someone with an interest in memory or decision making, and whose research program extends to neuroscience/neuropsychology domains. Applications are encouraged from all qualified candidates, including aboriginal peoples, persons with disabilities, and members of visible minorities. Interested candidates should consult the eligibility criteria for the UFA on the NSERC website: http://www.nserc.ca/programs/sf/UFA_e. To apply, send a curriculum vitae, a short statement of research interests, a publication list with selected reprints, and three letters of reference to: Dr. 
Bruce Milliken, Department of Psychology, McMaster University, Hamilton, Ontario, CANADA L8S 4K1. Closing date for applications and supporting material is November 15, 1998. From at at coglit.soton.ac.uk Tue Oct 6 04:05:13 1998 From: at at coglit.soton.ac.uk (Adriaan Tijsseling) Date: Tue, 6 Oct 1998 09:05:13 +0100 Subject: PhD Thesis available: Connectionist Models of Categorization Message-ID: The following PhD thesis is available on our website (http://cogito.psy.soton.ac.uk/~at/CALM/): Connectionist Models of Categorization: A Dynamical View of Cognition by Adriaan Tijsseling Abstract The functional role of altered similarity structure in categorization is analyzed. 'Categorical Perception' (CP) occurs when equal-sized physical differences in the signals arriving at our sensory receptors are perceived as smaller within categories and larger between categories (Harnad, 1987). Our hypothesis is that it is by modifying the similarity between internal representations that successful categorization is achieved. This effect depends in part on the iconicity of the inputs, which induces a similarity-preserving structure in the internal representations. Categorizations based on the similarity between stimuli are easier to learn than contra-iconic categorization; it is mainly to modify the latter in the service of categorization that the characteristic compression/separation of CP occurs. This hypothesis was tested in a series of neural net simulations of studies on category learning in human subjects. The nets are first pre-exposed to the inputs and then given feedback on their performance. The behavior of the resulting networks was then analyzed and compared to human performance. Before it is given feedback, the network discriminates and categorizes input based on the inherent similarity of the input structure. With corrective feedback the net moves its internal representations away from category boundaries.
The effect is that similarity of patterns that belong to different categories is decreased, while similarity of patterns from the same category is increased (CP). Neural net simulations make it possible to look inside a hypothetical black box of how categorization may be accomplished; it is shown how increased attention to one or more dimensions in the input and the salience of input features affect category learning. Moreover, the observed 'warping' of similarity space in the service of categorization can provide useful functionality by creating compact, bounded chunks (Miller, 1965) with category names that can then be combined into higher-order categories described by the symbol strings of natural language and the language of thought (Greco, Cangelosi, & Harnad, 1997). The dynamic models of categorization of the kind analyzed here can be extended to make them powerful models of neuro-symbolic processing (Casey, 1997) and a fruitful territory for future research. From terry at salk.edu Tue Oct 6 13:28:27 1998 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 6 Oct 1998 10:28:27 -0700 (PDT) Subject: NEURAL COMPUTATION 10:8 Message-ID: <199810061728.KAA29775@helmholtz.salk.edu> Neural Computation - Contents Volume 10, Number 8 - November 15, 1998 ARTICLE Competition for Neurotrophic Factors: Mathematical Analysis T. Elliott and N. R. Shadbolt NOTE Why Does the Somatosensory Homunculus Have Hands Next to Face and Feet Next to Genitals?: An Hypothesis Martha J. Farah LETTERS Extracting Oscillations: Neuronal Coincidence Detection with Noisy Periodic Spike Input Richard Kempter, Wulfram Gerstner, J. Leo van Hemmen and H. Wagner Connecting Cortical And Behavioral Dynamics: Bimanual Coordination V. K. Jirsa, A. Fuchs, J.A.S. Kelso Constructive Incremental Learning from Only Local Information Stefan Schaal and Christopher G. Atkeson Information Maximization and Independent Component Analysis: Is There a Difference? D. Obradovic and G. 
Deco An Alternative Perspective on Adaptive Independent Component Analysis Algorithms Mark Girolami Density Estimation by Mixture Models with Smoothing Priors Akio Utsugi Complexity Issues in Natural Gradient Descent Method for Training Multilayer Perceptrons Howard Hua Yang and Shun-ichi Amari Almost Linear VC-Dimension Bounds for Piecewise Polynomial Networks Peter L. Bartlett, Vitaly Maiorov, and Ron Meir The Diabolo Classifier Holger Schwenk Online Learning from Finite Training Sets and Robustness to Input Bias Peter Sollich and David Barber Anti-Predictable Sequences: Harder to Predict Than Random Sequences Huaiyu Zhu and Wolfgang Kinzel ----- ABSTRACTS - http://mitpress.mit.edu/NECO/
SUBSCRIPTIONS - 1999 - VOLUME 11 - 8 ISSUES
                  USA    Canada*   Other Countries
Student/Retired   $50    $53.50    $84
Individual        $82    $87.74    $116
Institution       $302   $323.14   $336
* includes 7% GST
(Back issues from Volumes 1-10 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From a.burkitt at medoto.unimelb.edu.au Tue Oct 6 19:30:48 1998 From: a.burkitt at medoto.unimelb.edu.au (Anthony BURKITT) Date: Wed, 07 Oct 1998 09:30:48 +1000 Subject: Preprints available Message-ID: <60E1B9CE4896D111A22700E02910059714719E@mail.medoto.unimelb.EDU.AU> The following two papers on the analysis of integrate and fire neurons are available at the site: http://www.medoto.unimelb.edu.au/people/burkitta/Pubs.html 1. Analysis of integrate and fire neurons: synchronization of synaptic input and spike output 2. New technique for analyzing integrate and fire neurons I'd welcome comments. Cheers, Tony Burkitt =============================================== Analysis of integrate and fire neurons: synchronization of synaptic input and spike output A. N. Burkitt and G. M.
Clark A new technique for analyzing the probability distribution of output spikes for the integrate and fire model is presented. This technique enables us to investigate models with arbitrary synaptic response functions that incorporate both leakage across the membrane and a rise time of the postsynaptic potential. The results, which are compared with numerical simulations, are exact in the limit of a large number of small amplitude inputs. This method is applied to the synchronization problem, in which we examine the relationship between the spread in arrival times of the inputs (the temporal jitter of the synaptic input) and the resultant spread in the times at which the output spikes are generated (output jitter). The results of previous studies, which indicated that the ratio of the output jitter to the input jitter is consistently less than one and that it decreases for increasing numbers of inputs, are confirmed for three classes of the integrate and fire model. In addition to the previously identified factors of axonal propagation times and synaptic jitter, we identify the variation in the spike generating thresholds of the neurons and the variation in the number of active inputs as being important factors that determine the timing jitter in layered networks. Previously observed phase differences between optimally and suboptimally stimulated neurons may be understood in terms of the relative time taken to reach threshold. http://www.medoto.unimelb.edu.au/people/burkitta/synch.ps.zip Accepted for publication in Neural Computation (to appear) =============================================== New technique for analyzing integrate and fire neurons A. N. Burkitt and G. M. Clark By integrating over the distribution of arrival times of the afferent postsynaptic potentials (PSPs), the probability density of the summed potential is calculated. 
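The summed-potential picture sketched in these two abstracts can be mimicked numerically. The toy simulation below is an illustration only, not Burkitt and Clark's analytical method: the alpha-function PSP shape, the Gaussian arrival-time jitter, and all parameter values are assumptions chosen for the sketch. It sums jittered PSPs, finds the first threshold crossing, and estimates the output-jitter to input-jitter ratio, which the first paper reports to be consistently less than one.

```python
import numpy as np

rng = np.random.default_rng(0)

def psp(t, tau=5.0):
    """Alpha-function postsynaptic potential with rise time tau (ms).
    Illustrative choice; zero before the input arrives, peak of 1 at t = tau."""
    t = np.maximum(t, 0.0)
    return (t / tau) * np.exp(1.0 - t / tau)

def output_spike_time(arrivals, amp, threshold, t_grid):
    """Sum the PSPs of all inputs and return the first threshold crossing."""
    v = amp * psp(t_grid[None, :] - arrivals[:, None]).sum(axis=0)
    above = np.nonzero(v >= threshold)[0]
    return t_grid[above[0]] if above.size else np.nan

n_inputs, amp, threshold = 100, 0.02, 1.0
t_grid = np.arange(0.0, 40.0, 0.01)      # time axis in ms
input_jitter = 2.0                       # std dev of input arrival times (ms)

out_times = []
for _ in range(200):                     # repeat trials to measure output spread
    arrivals = rng.normal(10.0, input_jitter, n_inputs)
    t_out = output_spike_time(arrivals, amp, threshold, t_grid)
    if not np.isnan(t_out):
        out_times.append(t_out)

output_jitter = np.std(out_times)
print(output_jitter / input_jitter)      # ratio below one, as the paper reports
```

Because the threshold crossing time depends on the average of many jittered arrivals, the output spread is much narrower than the input spread, which is the qualitative effect the analysis makes exact in the limit of many small-amplitude inputs.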
An output spike is generated when the potential reaches threshold, and the probability density of output spikes (the first-passage density) is calculated. This "integrated-input" technique enables the investigation of models that incorporate the decay of the membrane potential and rise time of the synaptic current. PSPs with a distribution of amplitudes, including inhibitory PSPs, are also analyzed. The results are exact in the limit of large numbers of small amplitude inputs. http://www.medoto.unimelb.edu.au/people/burkitta/cns98.ps.zip Presented at CNS*98, to appear in Neurocomputing (in 1999). ====================ooOOOoo==================== Anthony N. Burkitt The Bionic Ear Institute 384-388 Albert Street East Melbourne, VIC 3002 Australia Email: a.burkitt at medoto.unimelb.edu.au http://www.medoto.unimelb.edu.au/people/burkitta Phone: +61 - 3 - 9283 7510 Fax: +61 - 3 - 9283 7518 =====================ooOOOoo=================== From Annette_Burton at Brown.edu Tue Oct 6 12:22:56 1998 From: Annette_Burton at Brown.edu (Annette Burton) Date: Tue, 6 Oct 1998 12:22:56 -0400 Subject: PROGRAM ANNOUNCEMENT Message-ID: Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science have just received a new NSF-supported Interdisciplinary Graduate Education, Research and Training (IGERT) program, with support for graduate students. Could you please post the following message on your list. Thank you.
Katherine Demuth Department of Cognitive and Linguistic Sciences Brown University ------------- Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce A NEW INTERDISCIPLINARY GRADUATE TRAINING PROGRAM in LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES Deadline for Applications: January 1, 1999 Brown University is actively recruiting graduate students for a new NSF-supported Interdisciplinary Graduate Education, Research and Training (IGERT) program in "Learning and Action in the Face of Uncertainty: Cognitive, Computational and Statistical Approaches". The use of probabilistic models and statistical methods has had a major impact on our understanding of language, vision, action, and reasoning. This training program provides students with the opportunity to integrate a detailed study of human or artificial systems for language acquisition and use, visual processing, action, and reasoning with appropriate mathematical and computational models. Students will be enrolled in one of the three participating departments (Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science) and will study an interdisciplinary program of courses in topics such as statistical estimation, cognitive processes, linguistics, and computational models. The aim of this program is to provide promising students with a mix of mathematical, computational and experimental expertise to carry out multidisciplinary collaborative research across the disciplines of Applied Mathematics, Computer Science, and Cognitive Science. Interested students should apply to the participating department closest to their area of interest and expertise, and should indicate their interest in the IGERT training program in their application. Brown University is an Equal Opportunity/Affirmative Action Employer.
For additional information about the program, application procedures, and ongoing research initiatives please visit our website at: http://www.cog.brown.edu/IGERT or contact Dr. Julie Sedivy Department of Cognitive & Linguistic Sciences Brown University, Box 1978 Providence, RI 02912 USA Julie_Sedivy at brown.edu ***************************** Katherine Demuth Dept. of Cognitive & Linguistic Sciences Brown University, Box 1978 Providence, RI 02912 TEL: (401) 863-1053 FAX: (401) 863-2255 From barry at dcs.rhbnc.ac.uk Wed Oct 7 06:50:53 1998 From: barry at dcs.rhbnc.ac.uk (Barry Rising) Date: Wed, 7 Oct 1998 11:50:53 +0100 (BST) Subject: ANNOUNCEMENT: ASPeCT Fraud Detection Workshop Message-ID: ASPeCT (Advanced Security for Personal Communications Technologies) is a Europe-wide research project which is part of the European Commission's ACTS programme. One of the topics addressed by ASPeCT has been the investigation of fraud detection techniques for third-generation mobile telecommunications using neural networks. On behalf of ASPeCT I would like to invite you to the ASPeCT Fraud Detection Workshop to be held on Friday 16th October 1998 at Savill Court Conference Centre, Egham, Surrey, England. The aims of the workshop are to promote awareness of the problems of fraud detection in mobile telecom technology and credit risk assessment using automated techniques. The speakers are from the ASPeCT fraud detection project and experts from industry who will relay their experiences in this rapidly changing research area. The cost of the workshop is 50 UK pounds, which includes lunch, coffee breaks and proceedings. Places are strictly limited. For more details see http://www.dcs.rhbnc.ac.uk/~barry or contact me via email (barry at dcs.rhbnc.ac.uk). More information about the ASPeCT project can be found at: http://www.esat.kuleuven.ac.be/cosic/aspect where you can also download some of the reports produced by the project.
Best regards, Barry Rising Research Assistant, ASPeCT From higdon at stat.Duke.EDU Thu Oct 8 12:33:59 1998 From: higdon at stat.Duke.EDU (David Higdon) Date: Thu, 8 Oct 1998 12:33:59 -0400 (EDT) Subject: Postdoc Opportunity Message-ID: *** Postdoctoral Research Associate *** Institute of Statistics & Decision Sciences Duke University Applications are invited for a position as a post-doctoral research associate in the Institute of Statistics and Decision Sciences, Duke University. This is an initial three-year appointment, with a target starting date of May 1st, 1999. The position arises as part of a new multi-disciplinary research project on multi-scale modeling and simulation in scientific inference, funded by NSF. This project, conducted through the Center for Multi-Scale Modeling and Distributed Computing at Duke, combines research teams from statistics, hydrology, mathematics and computer science, and has key research focuses in challenging problems of spatial statistical modeling and advanced statistical computation. Applied project components concern important problems in environmental and petroleum hydrology. The project begins in May 1999 with an initial three year term. The Postdoctoral Research Associate will focus on research related to the development of novel multi-resolution spatial models, advanced statistical computation, and methodology for integrating data sources in an applied setting. The associate will work closely with ISDS researchers and investigators and postdoctoral associates in mathematics, computing and hydrology. Suitable applicants will have a PhD in Statistics or a related field. We particularly seek candidates with experience in spatial modeling and advanced scientific computing, and with inclinations towards cross-disciplinary research. Applications from suitably qualified women and minority candidates are particularly encouraged. Duke University is an Equal Opportunity/Affirmative Action Employer. 
For additional information, visit our web page at www.stat.duke.edu To apply, please send a CV and 3 letters of reference to: Search Committee, Box 90251, Duke University, Durham NC 27708-0251. Informal inquiries may be directed to Dave Higdon at higdon at stat.duke.edu or Mike West at mw at stat.duke.edu. From mel at lnc.usc.edu Thu Oct 8 15:45:57 1998 From: mel at lnc.usc.edu (Bartlett Mel) Date: Thu, 08 Oct 1998 12:45:57 -0700 Subject: NIPS Travel Grants Message-ID: <361D1675.D68DF3E9@lnc.usc.edu> Neural Information Processing Systems Conference NIPS*98 ------- TRAVEL GRANTS ------- ****** Deadline Oct. 19 ******** Limited funds will be available to support the travel of young investigators, post-doctoral fellows, and graduate students to NIPS. Awards will be based on both merit and need. The amount of aid will generally not exceed $400 for domestic (US) travel; increased amounts may be available for participants travelling from overseas. Conference registration is not covered by travel awards. In order to speed processing and reduce applicant workload, applications for travel grants MUST be submitted this year via the NIPS Conference web page form for this purpose: http://www.cs.cmu.edu/Groups/NIPS/NIPS98/Travel.html With the creation of this automated application system, there is no need to submit hardcopies of your application, as was indicated in the (now superseded) printed brochure. Even if you have already submitted an application to the NIPS Foundation office, you must still submit the web page form. Deadline for submission is Oct. 19. Notification of award will be emailed in early November. Travel award checks in US$ may be picked up upon registration at the conference. A photocopy of your airline ticket will be required. Applications that arrive after the deadline will be collected and considered as a separate pool in case any awarded travel funds are left uncollected at the conference. Bartlett Mel NIPS*98 Treasurer -- Bartlett W.
Mel (213)740-0334, -3397(lab) Assistant Professor of Biomedical Engineering (213)740-0343 fax University of Southern California, OHE 500 mel at lnc.usc.edu, http://lnc.usc.edu US Mail: BME Department, MC 1451, USC, Los Angeles, CA 90089 Fedex: 3650 McClintock Ave, 500 Olin Hall, LA, CA 90089 From audeb at dai.ed.ac.uk Fri Oct 9 13:34:59 1998 From: audeb at dai.ed.ac.uk (Aude Billard) Date: Fri, 9 Oct 1998 18:34:59 +0100 Subject: Preprints available Message-ID: <19324.199810091734@osprey> The following paper "DRAMA, a connectionist architecture for control and learning in autonomous robots" is to appear in Adaptive Behaviour Journal, vol. 7:1 (January 1999). A preprint of the paper is available at: http://www.dai.ed.ac.uk/daidb/people/homes/audeb/publication.html This paper reports on the development of a novel connectionist architecture for on-line learning of spatio-temporal regularities and time series in discrete sequences of inputs of an autonomous mobile robot. An on-line version of my PhD thesis, of which the paper reports some aspects, will soon be available. I would be very grateful for any comments. Thank you for transmitting the message. Aude Billard ================================================================= DRAMA, a connectionist architecture for control and learning in autonomous robots, Billard A. and Hayes G. (1998), Adaptive Behaviour Journal, vol. 7:1. This work proposes a connectionist architecture, DRAMA, for dynamic control and learning of autonomous robots. DRAMA stands for dynamical recurrent associative memory architecture. It is a time-delay recurrent neural network using Hebbian update rules. It allows learning of spatio-temporal regularities and time series in discrete sequences of inputs, even in the presence of substantial noise. The first part of this paper gives the mathematical description of the architecture and analyses its performance theoretically and through numerical simulations.
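The core mechanism the abstract describes, Hebbian association of time-delayed activations in a recurrent net, can be sketched in a few lines. This is a deliberately minimal illustration, not the DRAMA equations themselves: the bipolar patterns, the single one-step delay, and the sign readout are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64  # number of units, standing in for sensor/actuator channels

# A training sequence of random bipolar patterns.
sequence = [rng.choice([-1.0, 1.0], size=n) for _ in range(5)]

# Hebbian rule with a one-step time delay: each weight links the
# activation at time t-1 to the activation at time t.
W = np.zeros((n, n))
for prev, curr in zip(sequence, sequence[1:]):
    W += np.outer(curr, prev) / n

def recall_next(x):
    """Predict the pattern that followed x during training."""
    return np.sign(W @ x)

# Replay the learned sequence and measure per-unit recall accuracy.
accuracy = np.mean([np.mean(recall_next(a) == b)
                    for a, b in zip(sequence, sequence[1:])])
print(accuracy)  # close to 1.0 for a short sequence like this one
```

With only five patterns over 64 units the cross-talk terms are small, so a single Hebbian pass replays the sequence almost perfectly; DRAMA itself adds the machinery (delays on individual connections, on-line updates, noise tolerance) needed for real robot data.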
The second part of this paper reports on the implementation of DRAMA in simulated and physical robotic experiments. Training and rehearsal of the DRAMA architecture are computationally fast and inexpensive, which makes the model particularly suitable for controlling `computationally-challenged' robots. In the experiments, we use a basic hardware system with very limited computational capability and show that our robot can carry out real-time computation and on-line learning of relatively complex cognitive tasks. In these experiments, two autonomous robots wander randomly in a fixed environment, collecting information about its elements. By mutually associating information from their sensors and actuators, they learn about the physical regularities underlying their experience of varying stimuli. The agents also learn from their mutual interactions. We use a teacher-learner scenario, based on mutual following of the two agents, to enable transmission of a vocabulary from one robot to the other. Keywords: Time-delay recurrent neural network; Hebbian learning; spatio-temporal associations; unsupervised dynamical learning; autonomous robots. From emmanuel.pothos at psy.ox.ac.uk Mon Oct 12 14:24:42 1998 From: emmanuel.pothos at psy.ox.ac.uk (Emmanuel Pothos) Date: Mon, 12 Oct 1998 19:24:42 +0100 (BST) Subject: post doc position Message-ID: UNIVERSITY OF WALES, BANGOR RESEARCH ASSISTANT JOB ADVERT SALARY GRADE 1B - 15,735 - 18,275 A Research Officer is sought to work with Dr. Emmanuel Pothos and Professor Nick Chater on an exciting project relating to human categorisation. In the first instance, the project would involve computer implementation and analysis of an information-theory model of classification developed by Pothos and Chater. Subsequently, human experimental work would be designed to investigate the psychological plausibility of the model.
Although the emphasis will be on the psychological implications of the theory, statistical applications will also be pursued, with a view to addressing clustering and data mining problems. Candidates should have a high level of computer programming competence (preferably C or Matlab); expertise in the statistical clustering literature and human experimental skills are desirable but not necessary. Thus, we welcome applications from candidates with a Ph.D. or a good MSc in computer science, statistics, cognitive psychology and related disciplines. The post will be for one year commencing January 1999 (although this is flexible), held at the University of Wales, Bangor, in one of the fastest growing departments in the UK, which received a 5A rating in the most recent RAE. Application forms and further particulars are available by contacting: Personnel Services, University of Wales, Bangor, Gwynedd, LL57 2DG. tel.: 01248-382926/388132. e-mail: pos020 at bangor.ac.uk. Please quote reference number 98/201 when applying. Closing date for applications: Monday 16th November 1998. Informal enquiries can be made by contacting Dr Emmanuel Pothos (e-mail: e.pothos at bangor.ac.uk) or Professor Nick Chater (e-mail: nick.chater at warwick.ac.uk). Committed to Equal Opportunities From harnad at coglit.soton.ac.uk Tue Oct 13 04:25:50 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Tue, 13 Oct 1998 09:25:50 +0100 (BST) Subject: BBS Call (Neuron Doctrine) + 3 important announcements Message-ID: 3 important announcements, followed by BBS Call for Commentators (Gold/Stoljar: Neuron Doctrine): ------------------------------------------------------------------ (1) There have been some very important developments in the area of Web archiving of scientific papers in this last month.
Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers on their Home-Servers as well as on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ --------------------------------------------------------------------- Below is the abstract of a forthcoming BBS target article on: A NEURON DOCTRINE IN THE PHILOSOPHY OF NEUROSCIENCE by Ian Gold & Daniel Stoljar This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. 
To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. 
_____________________________________________________________ A NEURON DOCTRINE IN THE PHILOSOPHY OF NEUROSCIENCE Ian Gold Institute of Advanced Studies, Australian National University, Canberra ACT 0200, Australia iangold at coombs.anu.edu.au Daniel Stoljar Department of Philosophy and Institute of Cognitive Science, University of Colorado, Boulder 80309 stoljar at colorado.edu and Institute of Advanced Studies, Australian National University, Canberra ACT 0200, Australia dstoljar at coombs.anu.edu.au KEYWORDS: Churchlands, classical conditioning, cognitive neuroscience, Kandel, learning, materialism, mind, naturalism, neurobiology, neurophilosophy, philosophy of neuroscience, psychology, reduction, theoretical unification ABSTRACT: Many neuroscientists and philosophers endorse a view about the explanatory reach of neuroscience, which we will call the neuron doctrine: the view that the framework for understanding the mind will be developed by neuroscience or, as we will put it, that a successful theory of the mind will be solely neuroscientific. It is a consequence of this view that the sciences of the mind that cannot be expressed by means of neuroscientific concepts alone count as indirect sciences that will be discarded as neuroscience matures. This consequence is what makes the doctrine substantive, indeed radical. We ask, first, what the neuron doctrine means and, second, whether it is true. In answer to the first question, we distinguish two versions of the doctrine. One version, the trivial neuron doctrine, turns out to be uncontroversial but unsubstantive because it fails to have the consequence that the non-neuroscientific sciences of the mind will eventually be discarded. A second version, the radical neuron doctrine, does have this consequence but, unlike the first, is highly controversial. We argue that the neuron doctrine appears to be both substantive and uncontroversial only as a result of a conflation of these two versions.
We then consider whether the radical doctrine is true. We present and evaluate three arguments for it, based either on general scientific and philosophical considerations or on the details of neuroscience itself, and argue that all three fail. We conclude that the evidence fails to support the radical neuron doctrine. ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.gold.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.gold ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.gold To retrieve a file by ftp from an Internet site, type either: ftp ftp.princeton.edu or ftp 128.112.128.1 When you are asked for your login, type: anonymous Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@") cd /pub/harnad/BBS To show the available files, type: ls Next, retrieve the file you want with (for example): get bbs.gold When you have the file(s) you want, type: quit From bert at mbfys.kun.nl Tue Oct 13 08:29:51 1998 From: bert at mbfys.kun.nl (Bert Kappen) Date: Tue, 13 Oct 1998 14:29:51 +0200 (MET DST) Subject: PhD position available Message-ID: <199810131229.OAA10471@bertus.mbfys.kun.nl> PhD position for neural network research at SNN, University of Nijmegen, the Netherlands.
Background: The SNN neural networks research group at the University of Nijmegen consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and graphical models. The group is part of the Laboratory of Biophysics, which is involved in experimental brain science. Recent research of the group has focussed on the theoretical description of learning processes using the theory of stochastic processes and the design of efficient learning rules for Boltzmann machines using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; the design of robust methods for confidence estimation with neural networks; and applications in medical diagnosis and prediction of consumer behaviour. Research project: The modern view in AI, neural networks and parts of statistics is to describe learning and reasoning using a probabilistic framework. A particular advantage of the probabilistic framework is that domain knowledge in the form of rules and data can easily be combined in model construction. The main drawback is that inference and learning in large probabilistic networks is intractable. Therefore, robust approximation schemes are needed to apply this technology to large real-world applications. The topic of research is to develop learning rules for neural networks and graphical models using techniques from statistical mechanics. Requirements: The candidate should have a strong background in theoretical physics or mathematics. The PhD position: Appointment will be full-time for four years. Gross salary will be NLG 2184 per month in the first year, increasing to NLG 3899 in the fourth year. More information: Details about the research can be found at http://www.mbfys.kun.nl/SNN or by contacting dr. H.J. Kappen (bert at mbfys.kun.nl, ++31243614241).
Applications should include a curriculum vitae and a statement of the candidate's professional interests and goals, and one copy of recent work (e.g., thesis, article). Interested applicants should express their interest by email to bert at mbfys.kun.nl before October 17. Full applications should be sent before October 25 to the Personnel Department of the Faculty of Natural Sciences, University of Nijmegen, Toernooiveld 1, 6525 ED Nijmegen, vacancy number 98-52. From alimoglu at cns.bu.edu Tue Oct 13 16:20:39 1998 From: alimoglu at cns.bu.edu (Fevzi Alimoglu) Date: Tue, 13 Oct 1998 16:20:39 -0400 (EDT) Subject: Call For Papers: TAINN'99 Message-ID: CALL FOR PAPERS TAINN'99 The Eighth Turkish Symposium on Artificial Intelligence and Neural Networks June 23-25 1999 Bogazici University, Istanbul, TURKEY ORGANIZED BY: Bogazici University BACKGROUND: TAINN'99 is the eighth in a series of symposia intended to bring together researchers from the fields of artificial intelligence and neural networks. The symposium will be held in the historical city of Istanbul, which has been the capital of two empires, and is the economic and cultural center of modern Turkey.
SCOPE: AI Architectures, Artificial Life, Automated Modeling, Automated Reasoning, Bayesian Learning and Belief Networks, Control, Causality, Cognitive Modeling, Common Sense Reasoning, Computer-Aided Education, Decision Trees, Design, Diagnosis, Discourse, Discovery, Distributed AI, Expert Systems, Fuzzy Logic, Game Playing, Genetic Algorithms, Geometric or Spatial Reasoning, Hardware Realizations of Neural Networks and Fuzzy Systems, Information Retrieval, Knowledge Acquisition, Knowledge Representation, Logic Programming, Machine Discovery, Machine Learning, Machine Translation, Mathematical Foundations, Multi-agent Systems, Multimedia, Natural Language Processing, Neural Networks, Non-monotonic Reasoning, Pattern Recognition, Perception, Philosophical Foundations, Planning, Problem Solving, Qualitative Reasoning, Real-Time Systems, Reinforcement Learning, Robotics, Scheduling, Search, Simulation, Software Agents, Speech Understanding, Symbolic Computation, Temporal Reasoning, Vision. PAPER SUBMISSION: Original papers from all areas listed above are solicited. The symposium languages are Turkish and English. All submitted papers will be refereed on the basis of quality, significance, and clarity by program committee members and other reviewers. All accepted papers will be published in the symposium proceedings which will be available at the meeting. The symposium will consist of technical presentations, invited talk(s) and poster sessions. Prospective authors are invited to submit full papers exclusively in electronic media in the format of postscript files attached to e-mail messages. Papers are limited to 10 pages, including figures and references, in single-spaced one-column format using a font size of 11 points. See the symposium webpage for style information. PROGRAM COMMITTEE: L. Akarun Turkey H. L. Akin Turkey V. Akman Turkey F. Alpaslan Turkey E. Alpaydin Turkey V. Atalay Turkey I. Aybay N. Cyprus C. Bozsahin Turkey I. Bratko Slovenia I. 
Cicekli Turkey T. Ciftcibasi N. Cyprus D. Davenport Turkey C. Dichev Bulgaria G. W. Ernst USA A. Fatholahzadeh France M. Guler Turkey F. Gurgen Turkey H. A. Guvenir Turkey C. Guzelis Turkey U. Halici Turkey H. Hamburger USA S. Kocabas Turkey F. Masulli Italy K. Oflazer Turkey E. Oztemel Turkey Y. Ozturk USA J.-I. Tsujii UK, Japan G. Ucoluk Turkey N. Yalabik Turkey F. T. Yarman-Vural Turkey A. Yazici Turkey PROGRAM CO-CHAIRS: A. C. Cem Say say at boun.edu.tr Gunhan Dundar dundar at boun.edu.tr DEADLINES: Submission: February 1, 1999 Notification: April 5, 1999 Final Copy Due: May 3, 1999 For more information, frequently access the URL: http://www.cmpe.boun.edu.tr/~tainn99/ CONTACTS: TAINN'99 Department of Computer Engineering Boğaziçi University, Bebek 80815, İstanbul, TURKEY e-mail: say at boun.edu.tr phone: 90-212-2631540 Ext. 1628 (leave message) fax: 90-212-2872461 From skumar at cs.utexas.edu Wed Oct 14 11:41:53 1998 From: skumar at cs.utexas.edu (Shailesh Kumar) Date: Wed, 14 Oct 1998 10:41:53 -0500 (CDT) Subject: thesis on Q-learning for packet routing Message-ID: <199810141541.KAA22300@plucky.cs.utexas.edu> Dear Connectionists, My thesis on packet routing with Q-learning is available at the UTCS Neural Networks Research Group web site http://www.cs.utexas.edu/users/nn. - Shailesh CONFIDENCE BASED DUAL REINFORCEMENT Q-ROUTING: AN ON-LINE ADAPTIVE NETWORK ROUTING ALGORITHM Shailesh Kumar Master's Thesis; Technical Report AI98-267, Department of Computer Sciences, The University of Texas at Austin (108 pages) FTP-host: ftp.cs.utexas.edu FTP-filename: pub/neural-nets/papers/kumar.msthesis.ps.Z http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#kumar.msthesis.ps.Z Efficient routing of information packets in dynamically changing communication networks requires that as the load levels, traffic patterns and topology of the network change, the routing policy also adapts.
Making globally optimal routing decisions would require a central observer/controller with complete information about the state of all nodes and links in the network, which is not realistic. Therefore, the routing decisions must be made locally by individual nodes (routers) using only local routing information. The routing information at a node could be estimates of packet delivery time to other nodes via its neighbors or estimates of queue lengths of other nodes in the network. An adaptive routing algorithm should efficiently explore and update routing information available at different nodes as it routes packets. It should continuously evolve efficient routing policies with minimum overhead on network resources. In this thesis, an on-line adaptive network routing algorithm called Confidence-based Dual Reinforcement Q-Routing (CDRQ-Routing), based on the Q-learning framework, is proposed and evaluated. In this framework, the routing information at individual nodes is maintained as Q-value estimates of how long it will take to send a packet to any particular destination via each of the node's neighbors. These Q values are updated through exploration as the packets are transmitted. The main contribution of this work is faster adaptation and improved quality of routing policies over Q-Routing. The improvement is based on two ideas. First, the quality of exploration is improved by including a confidence measure with each Q value, representing how reliable the Q value is. The learning rate is a function of these confidence values. Second, the quantity of exploration is increased by including backward exploration into Q-learning. As a packet hops from one node to another, it not only updates a Q value in the sending node (forward exploration, as in Q-Routing), but also updates a Q value in the receiving node using the information appended to the packet when it is sent out (backward exploration).
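The confidence-weighted update just described can be sketched for a single node's routing table. This is a simplified illustration of the idea, not the thesis' exact update rules: the confidence-growth constant, the learning-rate floor, and the class interface are all assumptions made for the sketch.

```python
# Sketch of a CDRQ-style routing table at one node. Q[neighbor][dest]
# estimates delivery time to dest via that neighbor; C[neighbor][dest]
# in [0, 1] says how reliable the corresponding Q estimate is.

class CDRQNode:
    def __init__(self, neighbors, dests, q_init=100.0):
        self.Q = {n: {d: q_init for d in dests} for n in neighbors}
        self.C = {n: {d: 0.0 for d in dests} for n in neighbors}

    def best_neighbor(self, dest):
        """Route to the neighbor with the lowest estimated delivery time."""
        return min(self.Q, key=lambda n: self.Q[n][dest])

    def update(self, neighbor, dest, observed, c_remote):
        # Learning rate grows with the sender's confidence and shrinks as
        # our own estimate becomes reliable, so unreliable local estimates
        # are corrected aggressively (constants here are illustrative).
        eta = max(0.1, c_remote * (1.0 - self.C[neighbor][dest]))
        self.Q[neighbor][dest] += eta * (observed - self.Q[neighbor][dest])
        self.C[neighbor][dest] = min(1.0, self.C[neighbor][dest] + 0.1)

node = CDRQNode(neighbors=["B", "C"], dests=["Z"])
# Forward exploration: a packet sent to B returns B's estimated time-to-Z.
node.update("B", "Z", observed=12.0, c_remote=0.8)
# Backward exploration: a packet arriving FROM C carries C's estimate too,
# giving a second, essentially free update per hop.
node.update("C", "Z", observed=20.0, c_remote=0.5)
print(node.best_neighbor("Z"))  # B: the lower estimated delivery time wins
```

Because every hop triggers both a forward and a backward update, the table is refined roughly twice as fast per packet as with plain Q-Routing, which is the effect the experiments below quantify.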
Thus two Q value updates per packet hop occur in CDRQ-Routing, as against only one in Q-Routing. Certain properties of forward and backward exploration that form the basis of these update rules are stated and proved in this work. Experiments over several network topologies, including a 36-node irregular grid and a 128-node 7-D hypercube, indicate that the improvement in quality and increase in quantity of exploration contribute in complementary ways to the performance of the overall routing algorithm. CDRQ-Routing was able to learn optimal shortest path routing at low loads and efficient routing policies at medium loads almost twice as fast as Q-Routing. At high load levels, the routing policy learned by CDRQ-Routing was twice as good as that learned by Q-Routing in terms of average packet delivery time. CDRQ-Routing was found to adapt significantly faster than Q-Routing to changes in network traffic patterns and network topology. The final routing policies learned by CDRQ-Routing were able to sustain much higher load levels than those learned by Q-Routing. Analysis shows that exploration overhead incurred in CDRQ-Routing is less than 0.5% of the packet traffic. Various extensions of CDRQ-Routing, namely routing in heterogeneous networks (different link delays and router processing speeds), routing with adaptive congestion control (in the case of finite queue buffers), and the inclusion of predictive features in CDRQ-Routing, have been proposed as future work. CDRQ-Routing is superior to, and more realistic than, state-of-the-art distance vector routing and the Q-Routing algorithm. From moatl at cs.tu-berlin.de Thu Oct 15 12:38:39 1998 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Thu, 15 Oct 1998 18:38:39 +0200 Subject: PhD-position available Message-ID: <3626250F.69F3@cs.tu-berlin.de> COMPUTATIONAL NEUROSCIENCE PhD-POSITION AVAILABLE Dept.
of Computer Science Technical University of Berlin A PhD-position in the field of computational neuroscience is now available at the neural information processing group (Prof. Klaus Obermayer), Dept. of Computer Science of the Technical University of Berlin. The project aims at the development of new neural and statistical methods for the analysis of data sets generated by optical imaging of intrinsic signals. This method utilizes neural-activity-related changes in the light reflectance of neural tissue, which are mediated by variations in local blood-oxygenation, blood-flow, blood volume and light scattering. The raw data sets contain a superposition of several signal components, only some of which are related to neural activity with a sufficiently high spatial resolution. The methods to be developed include: (i) Theoretical studies of the image generation process by Monte-Carlo simulations of light propagation in neuronal tissue. (ii) Development and application of methods for adaptive denoising of the data sets and the extraction of stimulus-related signals from these data sets. Requirements: The candidate should have a degree in physics, mathematics, engineering or computer science and a strong background in theory. Appointment: The appointment will initially be for two years but can be extended by one more year. The salary is determined according to BAT IIa/2 corresponding to a net salary (after deduction of taxes) of approx. DM 1700.-- per month. More information: Details about the research can be found at http://www.ni.cs.tu-berlin.de or by contacting Prof. Klaus Obermayer (oby at cs.tu-berlin.de, ++49-30-314-73120). Applications should include CV, list of publications, a statement of the candidate's professional interests and goals, and one copy of recent work (e.g., thesis, article). Applicants should express their interest by email to oby at cs.tu-berlin.de before November 15. Full applications should be sent before November 30, 1998 to Prof. Dr.
Klaus Obermayer, FR2-1, FB 13, Informatik, Technische Universitaet Berlin Franklinstrasse 28/29 D-10587 Berlin Germany From raetsch at zoo.brain.riken.go.jp Fri Oct 16 07:32:06 1998 From: raetsch at zoo.brain.riken.go.jp (raetsch@zoo.brain.riken.go.jp) Date: Fri, 16 Oct 1998 20:32:06 +0900 (JST) Subject: TR on Soft Margins for AdaBoost Message-ID: Dear Connectionists, A new paper on a Soft Margin approach for AdaBoost is available: ``Soft Margins for AdaBoost'', G. R\"atsch, T. Onoda, K.-R. M\"uller, NeuroColt2 TechReport NC-TR-1998-021. http://www.first.gmd.de/~raetsch/Neurocolt_Margin.ps.gz Comments and questions are welcome. Please contact me at raetsch at first.gmd.de. Thanks, Gunnar R\"atsch Abstract: Recently ensemble methods like AdaBoost were successfully applied to character recognition tasks, seemingly defying the problems of overfitting. This paper shows that although AdaBoost rarely overfits in the low-noise regime, it clearly does so for higher noise levels. Central to understanding this fact is the margin distribution, and we find that AdaBoost achieves -- doing gradient descent in an error function with respect to the margin -- asymptotically a {\em hard margin} distribution, i.e. the algorithm concentrates its resources on a few hard-to-learn patterns (here an interesting overlap with Support Vectors emerges). This is clearly a sub-optimal strategy in the noisy case. We propose several regularization methods and generalizations of the original AdaBoost algorithm to achieve a Soft Margin -- a concept known from Support Vector learning. In particular we suggest (1) regularized AdaBoost$_{Reg}$ using the soft margin directly in a modified loss function and (2) regularized linear and quadratic programming (LP/QP-) AdaBoost, where the soft margin is attained by introducing slack variables. Extensive simulations demonstrate that the proposed regularized AdaBoost algorithms are useful and competitive for noisy data.
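[Editor's note: the margin distribution the abstract argues about can be made concrete with a minimal one-dimensional AdaBoost over threshold stumps. This sketch is the plain algorithm, not the regularized AdaBoost_Reg or LP/QP variants the report proposes; the data and stump learner are illustrative assumptions.]

```python
import numpy as np

def adaboost_stumps(X, y, rounds=10):
    """Plain AdaBoost with threshold stumps on 1-D data (illustration only)."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # pattern weights
    stumps, alphas = [], []
    for _ in range(rounds):
        best = None
        for thr in X:                  # candidate thresholds
            for sign in (1, -1):
                pred = sign * np.where(X > thr, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, sign)
        err, thr, sign = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # hypothesis weight
        pred = sign * np.where(X > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # upweight hard-to-learn patterns
        w /= w.sum()
        stumps.append((thr, sign))
        alphas.append(alpha)
    return stumps, np.array(alphas)

def margins(X, y, stumps, alphas):
    # Normalized margin y*f(x)/sum|alpha|; AdaBoost asymptotically pushes its
    # minimum up, concentrating on the hardest patterns ("hard margin").
    f = sum(a * s * np.where(X > t, 1, -1) for (t, s), a in zip(stumps, alphas))
    return y * f / np.abs(alphas).sum()

X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([-1, -1, -1, 1, 1, 1])
st, al = adaboost_stumps(X, y)
m = margins(X, y, st, al)
print(m.min() > 0)
```

On this separable toy set every training margin becomes positive; on noisy data the same reweighting mechanism spends ever more capacity on a few mislabeled patterns, which is the failure mode the soft-margin variants address.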
From mkearns at research.att.com Fri Oct 16 16:14:11 1998 From: mkearns at research.att.com (Michael J. Kearns) Date: Fri, 16 Oct 1998 16:14:11 -0400 (EDT) Subject: NIPS*98 Program Message-ID: <199810162014.QAA24117@radish.research.att.com> Appended below is the full program for the main conference of NIPS*98, from Program Chair Sara Solla. It will also be posted shortly to the NIPS web site at http://www.cs.cmu.edu/Groups/NIPS/ At the web site you can also find detailed information about the tutorials preceding the main conference, and the workshops in Breckenridge immediately following the main conference. Please be aware that on-line registration is available at the web site, and that the deadline for early registration is October 30. Michael Kearns NIPS*98 General Chair ********************************************************************** Program for NIPS 1998 Neural Information Processing Systems SUN NOV 29 ---------- 18:00-22:00 Registration MON NOV 30 ---------- 08:30-18:00 Registration 09:30-17:30 Tutorials 18:30 Reception and Conference Banquet 20:30 The laws of the WEB (Banquet talk) B. Huberman Xerox PARC TUE DEC 1 --------- Oral Session 1: 08:30 Statistics of visual images: neural representation and synthesis (Invited) E. Simoncelli New York University 09:20 Attentional modulation of human pattern discrimination psychophysics reproduced by a quantitative model (VS, Oral) L. Itti, J. Braun, D. Lee, C. Koch California Institute of Technology 09:40 Orientation, scale, and discontinuity as emergent properties of illusory contour shape (VS, Oral) K. Thornber, L. Williams NEC Research Institute, University of New Mexico 10:00 DTs: dynamic trees (AA, Spotlight) C. Williams, N. Adams Aston University Modeling stationary and integrated time series with autoregressive neural networks (LT, Spotlight) F. Leisch, A. Trapletti, K.
Hornik Technical University of Vienna Analog neural nets with Gaussian or other common noise distributions cannot recognize arbitrary regular languages (LT, Spotlight) W. Maass, E. Sontag Technical University of Graz, Rutgers University Semiparametric support vector and linear programming machines (AA, Spotlight) A. Smola, T. Friess, B. Schoelkopf GMD FIRST Blind separation of filtered sources using state-space approach (AA, Spotlight) L. Zhang, A. Cichocki RIKEN Brain Science Institute 10:15-11:00 Break Oral Session 2: 11:00 The bias-variance tradeoff and the randomized GACV (AA, Oral) G. Wahba, X. Lin, F. Gao, D. Xiang, R. Klein, B. Klein University of Wisconsin-Madison, SAS Institute 11:20 Kernel PCA and de-noising in feature spaces (AA, Oral) S. Mika, B. Schoelkopf, A. Smola, K. Mueller, M. Scholz, G. Raetsch GMD FIRST 11:40 Sparse code shrinkage: denoising by maximum likelihood estimation (AA, Oral) A. Hyvaarinen, P. Hoyer, E. Oja Helsinki University of Technology 12:00-14:00 Lunch Oral Session 3: 14:00 Temporally asymmetric Hebbian learning, spike timing and neuronal response variability (Invited) L. Abbott Brandeis University 14:50 Information maximization in single neurons (NS, Oral) M. Stemmler, C. Koch California Institute of Technology 15:10 Multi-electrode spike sorting by clustering transfer functions (NS, Oral) D. Rinberg, H. Davidowitz, N. Tishby NEC Research Institute 15:30 Distributional population codes and multiple motion models (NS, Spotlight) R. Zemel, P. Dayan University of Arizona, Massachusetts Institute of Technology Population coding with correlated noise (NS, Spotlight) H. Yoon, H. Sompolinsky Hebrew University Bayesian modeling of human concept learning (CS, Spotlight) J. Tenenbaum Massachusetts Institute of Technology Mechanisms of generalization in perceptual learning (CS, Spotlight) Z. Liu, D. Weinshall NEC Research Institute, Hebrew University An entropic estimator for structure discovery (SP, Spotlight) M.
Brand Mitsubishi Electric Research Laboratory 15:45-16:15 Break Oral Session 4: 16:15 The role of lateral cortical competition in ocular dominance development (NS, Oral) C. Piepenbrock, K. Obermayer Technical University of Berlin 16:35 Evidence for learning of a forward dynamic model in human adaptive control (CS, Oral) N. Bhushan, R. Shadmehr Johns Hopkins University 16:55-18:00 Poster Preview 19:30 Poster Session WED DEC 2 --------- Oral Session 5: 08:30 Computation by Cortical Modules (Invited) H. Sompolinsky Hebrew University 09:20 Learning curves for Gaussian processes (LT, Oral) P. Sollich University of Edinburgh 09:40 Mean field methods for classification with Gaussian processes (LT, Oral) M. Opper, O. Winther Aston University, Niels Bohr Institute 10:00 Dynamics of supervised learning with restricted training sets (LT, Spotlight) A. Coolen, D. Saad King's College London, Aston University Finite-dimensional approximation of Gaussian processes (LT, Spotlight) G. Trecate, C. Williams, M. Opper University of Pavia, Aston University Inference in multilayer networks via large deviation bounds (LT, Spotlight) M. Kearns, L. Saul AT&T Labs Gradient descent for general reinforcement learning (CN, Spotlight) L. Baird, A. Moore Carnegie Mellon University Risk sensitive reinforcement learning (CN, Spotlight) R. Neuneier, O. Mihatsch Siemens AG 10:15-11:00 Break Oral Session 6: 11:00 VLSI implementation of motion centroid localization for autonomous navigation (IM, Oral) R. Etienne-Cummings, M. Ghani, V. Gruev Southern Illinois University 11:20 Improved switching among temporally abstract actions (CN, Oral) R. Sutton, S. Singh, D. Precup, B. Ravindran University of Massachusetts, University of Colorado 11:40 Finite-sample convergence rates for Q-learning and indirect algorithms (CN, Oral) M. Kearns, S.
Singh AT&T Labs, University of Colorado 12:00-14:00 Lunch Oral Session 7: 14:00 Statistical natural language processing: better living through floating-point numbers (Invited) E. Charniak Brown University 14:50 Markov processes on curves for automatic speech recognition (SP, Oral) L. Saul, M. Rahim AT&T Labs 15:10 Approximate learning of dynamic models (AA, Oral) X. Boyen, D. Koller Stanford University 15:30 Learning nonlinear stochastic dynamics using the generalized EM algorithm (AA, Spotlight) Z. Ghahramani, S. Roweis University of Toronto, California Institute of Technology Reinforcement learning for trading systems (AP, Spotlight) J. Moody, M. Saffell Oregon Graduate Institute Bayesian modeling of facial similarity (AP, Spotlight) B. Moghaddam, T. Jebara, A. Pentland Mitsubishi Electric Research Laboratory, Massachusetts Institute of Technology Computation of smooth optical flow in a feedback connected analog network (IM, Spotlight) A. Stocker, R. Douglas University and ETH Zurich Classification on pairwise proximity data (AA, Spotlight) T. Graepel, R. Herbrich, P. Bollmann-Sdorra, K. Obermayer Technical University of Berlin 15:45-16:15 Break Oral Session 8: 16:15 Learning from dyadic data (AA, Oral) T. Hofmann, J. Puzicha, M. Jordan Massachusetts Institute of Technology, University of Bonn 16:35 Classification in non-metric spaces (VS, Oral) D. Weinshall, D. Jacobs, Y. Gdalyahu NEC Research Institute, Hebrew University 16:55-18:00 Poster Preview 19:30 Poster Session THU DEC 3 --------- Oral Session 9: 08:30 Convergence of the wake-sleep algorithm (LT, Oral) S. Ikeda, S. Amari, H. Nakahara RIKEN Brain Science Institute 08:50 Learning a continuous hidden variable model for binary data (AA, Oral) D. Lee, H. Sompolinsky Bell Laboratories, Hebrew University 09:10 Direct optimization of margins improves generalization in combined classifiers (LT, Oral) L. Mason, P. Bartlett, J. 
Baxter Australian National University 09:30 A polygonal line algorithm for constructing principal curves (AA, Oral) B. Kegl, A. Krzyzak, T. Linder, K. Zeger Concordia University, Queen's University, UC San Diego 09:50-10:30 Break Oral Session 10: 10:30 Fast neural network emulation of physics-based models for computer animation (AP, Oral) R. Grzeszczuk, D. Terzopoulos, G. Hinton Intel Corporation, University of Toronto 10:50 Graphical models for recognizing human interactions (AP, Oral) N. Oliver, B. Rosario, A. Pentland Massachusetts Institute of Technology 11:10 Things that think (Invited) N. Gershenfeld Massachusetts Institute of Technology 12:00 End of main conference POSTERS ------- A theory of mean field approximation (LT, Poster) T. Tanaka Tokyo Metropolitan University Replicator equations, maximal cliques, and graph isomorphism (AA, Poster) M. Pelillo University of Venice Almost linear VC dimension bounds for piecewise polynomial networks (LT, Poster) P. Bartlett, V. Maiorov, R. Meir Australian National University, Technion On the optimality of incremental neural network algorithms (LT, Poster) R. Meir, V. Maiorov Technion Controlling the complexity of HMM systems by regularization (SP, Poster) C. Neukirchen, G. Rigoll Gerhard-Mercator-University Duisburg The belief in TAP (LT, Poster) Y. Kabashima, D. Saad Tokyo Institute of Technology, Aston University Source separation as a by-product of regularization (AA, Poster) S. Hochreiter, J. Schmidhuber Technical University of Munich, IDSIA Optimizing admission control while ensuring quality of service in multimedia networks via reinforcement learning (CN, Poster) T. Brown, H. Tong, S. Singh University of Colorado SMEM algorithm for mixture models (AA, Poster) N. Ueda, R. Nakano, Z. Ghahramani, G. Hinton NTT Communication Science Laboratories, University of Toronto Neuronal regulation implements efficient synaptic pruning (NS, Poster) G. Chechik, I. Meilijson, E. 
Ruppin Tel-Aviv University Reinforcement learning based on on-line EM algorithm (CN, Poster) M. Sato, S. Ishii ATR Human Information Processing Research Laboratories, Nara Institute of Science and Technology Influence of changing the synaptic transmitter release probability on contrast adaptation of simple cells in the primary visual cortex (NS, Poster) P. Adorjan, K. Obermayer Technical University of Berlin Probabilistic visualization of high-dimensional binary data (AA, Poster) M. Tipping Microsoft Research Bayesian PCA (AA, Poster) C. Bishop Microsoft Research General-purpose localization of textured image regions (VS, Poster) R. Rosenholtz Xerox PARC Independent component analysis of intracellular calcium spike data (AP, Poster) K. Prank, J. Boerger, A. von zur Muehlen, G. Brabant, C. Schoefl Medical School Hannover Visualizing group structure (AA, Poster) M. Held, J. Puzicha, J. Buhmann University of Bonn Unsupervised clustering: the mutual information between parameters and observations (LT, Poster) D. Herschkowitz, J-P. Nadal Ecole Normale Superieure Using statistical properties of a labelled visual world to estimate scenes (VS, Poster) W. Freeman, E. Pasztor Mitsubishi Electric Research Laboratory Heeger's normalization, line attractor network and ideal observers (NS, Poster) S. Deneve, A. Pouget, P. Latham Georgetown University Learning Lie transformation groups for invariant visual perception (VS, Poster) R. Rao, D. Rudermann The Salk Institute Using collective intelligence to route Internet traffic (AP, Poster) D. Wolpert, K. Tumer, J. Frank NASA Ames Research Center Synergy and redundancy among brain cells of behaving monkeys (NS, Poster) I. Gat, N. Tishby Hebrew University, NEC Research Institute Where does the population vector of motor cortical cells point during reaching movements? (NS, Poster) P. Baraduc, E. Guigon, Y. Burnod Universite Pierre et Marie Curie Lazy learning meets the recursive least squares algorithm (AA, Poster) M. Birattari, G.
Bontempi, H. Bersini Universite Libre de Bruxelles Variational approximations of graphical models using undirected graphs (LT, Poster) D. Barber, W. Wiegerinck University of Nijmegen Making templates rotationally invariant: an application to rotated digit recognition (AP, Poster) S. Baluja Carnegie Mellon University Probabilistic modeling for face orientation discrimination: learning from labeled and unlabeled data (AP, Poster) S. Baluja Carnegie Mellon University Maximum-likelihood continuity mapping (MALCOM): an alternative to HMMs (SP, Poster) D. Nix, J. Hogden Los Alamos National Laboratory Signal detection in noisy weakly-active dendrites (NS, Poster) A. Manwani, C. Koch California Institute of Technology An integrated vision sensor for the computation of optical flow singular points (IM, Poster) C. Higgins, C. Koch California Institute of Technology A micropower CMOS adaptive amplitude and shift invariant vector quantizer (IM, Poster) R. Coggins, R. Wang, M. Jabri University of Sydney Coding time-varying signals using sparse, shift-invariant representations (SP, Poster) M. Lewicki, T. Sejnowski The Salk Institute Unsupervised classification with non-Gaussian mixture models using ICA (AA, Poster) T-W. Lee, M. Lewicki, T. Sejnowski The Salk Institute Discovering hidden features with Gaussian processes regression (AA, Poster) F. Vivarelli, C. Williams Aston University Adding constrained discontinuities to Gaussian process models of wind fields (AP, Poster) D. Cornford, I. Nabney, C. Williams Aston University General bounds on Bayes errors for regression with Gaussian processes (LT, Poster) M. Opper, F. Vivarelli Aston University Efficient Bayesian parameter estimation in large discrete domains (AA, Poster) N. Friedman, Y. Singer UC Berkeley, AT&T Labs Support vector machines applied to face recognition (VS, Poster) J. Phillips National Institute of Standards and Technology A principle for unsupervised hierarchical decomposition of visual scenes (CS, Poster) M.
Mozer University of Colorado Dynamically adapting kernels in support vector machines (LT, Poster) N. Cristianini, C. Campbell, J. Shawe-Taylor University of Bristol, University of London Graph matching for shape retrieval (AP, Poster) B. Huet, A. Cross, E. Hancock University of York Experiments with a memoryless algorithm which learns locally optimal stochastic policies for partially observable Markov decision processes J. Williams, S. Singh University of Colorado Basis selection for wavelet regression (AA, Poster) K. Wheeler NASA Ames Research Center Non-linear PI control inspired by biological control systems (CN, Poster) L. Brown, G. Gonye, J. Schwaber E.I. DuPont deNemours Utilizing time: asynchronous binding (CS, Poster) B. Love Northwestern University Discontinuous recall transitions induced by competition between short- and long-range interactions in recurrent networks (LT, Poster) N. Skantzos, C. Beckmann, A. Coolen King's College London On-line learning with restricted training sets: exact solution as benchmark for general theories (LT, Poster) H. Rae, P. Sollich, A. Coolen King's College London, University of Edinburgh Phase diagram and storage capacity of sequence storing neural networks (LT, Poster) A. Duering, A. Coolen, D. Sherrington Oxford University, King's College London Exploratory data analysis using radial basis function latent variable models (AA, Poster) A. Marrs, A. Webb DERA Spike-based compared to rate-based Hebbian learning (NS, Poster) R. Kempter, W. Gerstner, L. van Hemmen Technical University of Munich, Swiss Federal Institute of Technology Using analytic QP and sparseness to speed training of support vector machines (AA, Poster) J. Platt Microsoft Research Optimizing classifiers for imbalanced training sets (LT, Poster) G. Karakoulas, J. Shawe-Taylor Canadian Imperial Bank of Commerce, University of London Classification with linear threshold functions and the linear loss (LT, Poster) C. Gentile, M. 
Warmuth University of Milan, UC Santa Cruz Barycentric interpolators for continuous space & time reinforcement learning (CN, Poster) R. Munos, A. Moore Carnegie Mellon University Learning instance-independent value functions to enhance local search (CN, Poster) R. Moll, A. Barto, T. Perkins, R. Sutton University of Massachusetts Scheduling straight-line code using reinforcement learning and rollouts (AP, Poster) A. McGovern, E. Moss University of Massachusetts Global optimization of neural network models via sequential sampling (AA, Poster) J. de Freitas, M. Niranjan, A. Doucet, A. Gee Cambridge University Active noise canceling using analog neuro-chip with on-chip learning capability (IM, Poster) J-W. Cho, S-Y. Lee Korea Advanced Institute of Science and Technology A reinforcement learning algorithm in partially observable environments using short-term memory (CN, Poster) N. Suematsu, A. Hayashi Hiroshima City University Tight bounds for the VC-dimension of piecewise polynomial networks (LT, Poster) A. Sakurai Japan Advanced Institute of Science and Technology Applications of multi-resolution neural networks to mammography (AP, Poster) P. Sajda, C. Spence Sarnoff Corporation Restructuring sparse high dimensional data for effective retrieval (AA, Poster) C. Isbell, P. Viola Massachusetts Institute of Technology Vertex identification in high energy physics experiments (AP, Poster) G. Dror, H. Abramowicz, D. Horn The Academic College of Tel-Aviv-Yaffo, Tel-Aviv University A model for associative multiplication (CS, Poster) G. Christianson, S. Becker McMaster University Robot docking using mixtures of Gaussians (AP, Poster) M. Williamson, R. Murray-Smith, V. Hansen Massachusetts Institute of Technology, Technical University of Denmark, Daimler-Benz Computational differences between asymmetrical and symmetrical networks (LT, Poster) Z. Li, P. Dayan Massachusetts Institute of Technology A V1 model of pop out and asymmetry in visual search (VS, Poster) Z. 
Li Massachusetts Institute of Technology Modifying the parti-game algorithm for increased robustness, higher efficiency and better policies (CN, Poster) M. Al-Ansari, R. Williams Northeastern University Hierarchical ICA belief networks (AA, Poster) H. Attias UC San Francisco Exploring unknown environments with real-time heuristic search (CN, Poster) S. Koenig Georgia Institute of Technology Multiple paired forward-inverse models for human motor learning and control (CS, Poster) M. Haruno, D. Wolpert, M. Kawato ATR Human Information Processing Research Laboratories, University College London Learning to find pictures of people (VS, Poster) S. Ioffe, D. Forsyth UC Berkeley Facial memory is kernel density estimation (almost) (CS, Poster) M. Dailey, G. Cottrell, T. Busey UC San Diego, Indiana University Image statistics and cortical normalization models (NS, Poster) E. Simoncelli, O. Schwartz New York University Probabilistic sensor fusion (VS, Poster) R. Sharma, T. Leen, M. Pavel Oregon Graduate Institute Semi-supervised support vector machines (AA, Poster) K. Bennett, A. Demiriz Rensselaer Polytechnic Institute Optimizing correlation algorithms for hardware-based transient classification (IM, Poster) R. Edwards, G. Cauwenberghs, F. Pineda Johns Hopkins University On-line and batch parameter estimation of Gaussian mixtures based on the relative entropy (AA, Poster) Y. Singer, M. Warmuth AT&T Labs, UC Santa Cruz Very fast EM-based mixture model clustering using multiresolution kd-trees (AA, Poster) A. Moore Carnegie Mellon University Perceiving without learning: from spirals to inside/outside relations (CS, Poster) K. Chen, D. Wang Ohio State University A high performance k-NN classifier using a binary correlation matrix memory (IM, Poster) P. Zhou, J. Austin, J. Kennedy University of York Neural networks for density estimation (AA, Poster) M. Magdon-Ismail, A. 
Atiya California Institute of Technology Coordinate transformation learning of hand position feedback controller by using change of position error norm (CN, Poster) E. Oyama, S. Tachi University of Tokyo A randomized algorithm for pairwise clustering (AA, Poster) Y. Gdalyahu, D. Weinshall, M. Werman Hebrew University Familiarity discrimination of radar pulses (AP, Poster) E. Granger, S. Grossberg, M. Rubin, W. Streilein Ecole Polytechnique de Montreal, Boston University GLS: a hybrid classifier system based on POMDP research (CN, Poster) A. Hayashi, N. Suematsu Hiroshima City University Visualizing and analyzing single-trial event-related potentials (NS, Poster) T-P. Jung, S. Makeig, M. Westerfield, J. Townsend, E. Courchesne, T. Sejnowski The Salk Institute, Naval Health Research Center, UC San Diego Example based image synthesis of articulated figures (VS, Poster) T. Darrell Interval Research Maximum conditional likelihood via bound maximization and the CEM algorithm (AA, Poster) T. Jebara, A. Pentland Massachusetts Institute of Technology Boxlets: a fast convolution algorithm for signal processing and neural networks (AA, Poster) P. Simard, L. Bottou, P. Haffner, Y. LeCun AT&T Labs Learning multi-class dynamics A. Blake, B. North, M. Isard Oxford University Fisher scoring and a mixture of modes approach for approximate inference and learning in nonlinear state space models T. Briegel, V. Tresp Siemens AG Shrinking the tube: a new support vector regression algorithm (LT, Poster) B. Schoelkopf, P. Bartlett, A. Smola, R. Williamson GMD FIRST, Australian National University Regularizing AdaBoost (AA, Poster) G. Raetsch, T. Onoda, K. Mueller GMD FIRST A neuromorphic monaural sound localizer (IM, Poster) J. Harris, C-J. Pu, J. Principe University of Florida Convergence rates of algorithms for perceptual organization: detecting visual contours (AA, Poster) A. Yuille, J. 
Coughlan Smith-Kettlewell Eye Research Institute Minutemax: a fast approximation for minimax learning (VS, Poster) J. Coughlan, A. Yuille Smith-Kettlewell Eye Research Institute Call-based fraud detection in mobile communication networks using a hierarchical regime-switching model (AP, Poster) J. Hollmen, V. Tresp Helsinki University of Technology, Siemens AG The effect of eligibility traces on finding optimal memoryless policies in partially observable Markovian decision processes (CN, Poster) J. Loch University of Colorado Exploiting generative models in discriminative classifiers T. Jaakkola, D. Haussler UC Santa Cruz Complex cells as cortically amplified simple cells (NS, Poster) F. Chance, S. Nelson, L. Abbott Brandeis University Least absolute shrinkage is equivalent to quadratic penalization (AA, Poster) Y. Grandvalet, S. Canu Universite de Technologie de Compiegne Analog VLSI cellular implementation of the boundary contour system (IM, Poster) G. Cauwenberghs, J. Waskiewicz Johns Hopkins University Learning mixture hierarchies (AA, Poster) N. Vasconcelos, A. Lippman Massachusetts Institute of Technology Learning macro-actions in reinforcement learning (CN, Poster) J. Randlov Niels Bohr Institute From Dave_Touretzky at cs.cmu.edu Sun Oct 18 19:41:57 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Sun, 18 Oct 1998 19:41:57 -0400 Subject: CONNECTIONISTS policy on spam (unsolicited commercial email) Message-ID: <90.908754117@skinner.boltz.cs.cmu.edu> Recently I have seen a rash of conference announcements appearing in my personal mailbox, often from conferences I've never heard of before. There seems to be a trend, mainly in Europe but also sometimes in the US, to advertise academic conferences by spamming, as well as by the more socially acceptable means of Usenet newsgroup postings and messages to private mailing lists. 
The policy of the CONNECTIONISTS list is that we will not accept any announcements for conferences that engage in email spamming of personal mailboxes. Let me be clear about what "spamming" means. There is nothing wrong with a conference organizer sending email announcements to persons who have an established relationship with that conference, i.e., attendees from prior years, authors of papers submitted to the conference, or anyone who has written the conference asking for information. However, if a conference organizer uses the email addresses of people who have no prior relationship to the conference, e.g., people who merely posted to a newsgroup like comp.ai.neural-nets, or mentioned neural networks on their personal home page, or are known in the field from their publications in other conferences, that is spamming. Those people have not given even implicit permission to place their names on a mailing list. Sending unsolicited mail to such people is antisocial behavior, even if the message contains instructions for how to get off the mailing list. The Connectionists list operates with a "zero tolerance" policy toward spamming. One of the reasons the list is moderated is the need to filter spam. Neural network conference announcements are NOT spam when sent to this mailing list; they are entirely appropriate. But we will not cooperate with any conference that spams personal mailboxes. For our European friends: the spam problem is currently far worse in the US than in Europe. Bills have been proposed in the US Congress to make spamming illegal, but none have passed yet. It is very important that we not create loopholes for "acceptable" spam, such as academic conference announcements, as such loopholes will undermine the legal argument that sending ANY email solicitation to private individuals without their permission should be banned.
(Due to freedom of speech considerations, if conference announcements are considered "okay", then porno spam and advertisements for financial scams also have to be allowed. And we don't want that.) Anyone who wants to learn more about the spam problem and how to fight it is invited to visit the following web sites: CAUCE -- Coalition Against Unsolicited Commercial Email http://www.cauce.org and especially http://www.cauce.org/resources.html Fight Spam on the Internet! http://spam.abuse.net/ -- Dave Touretzky, Connectionists moderator From Nigel.Goddard at ed.ac.uk Mon Oct 19 10:01:56 1998 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Mon, 19 Oct 1998 15:01:56 +0100 Subject: Computational Functional MRI positions Message-ID: <22599.199810191426@canna.dcs.ed.ac.uk> Please forward to interested individuals University of Edinburgh Centre for Interactive Image Analysis Research Fellows in Computational Functional MRI This is an exciting new venture to visualise the working of the human brain during cognitive tasks using a new research MR scanner. The Centre for Interactive Image Analysis is run by the SHEFC Brain Imaging Research Center for Scotland and the Institute for Adaptive and Neural Computation at the University of Edinburgh. Working closely together, the two postholders will jointly engage in the following aspects of the Centre's work: * establish functional imaging on a research MRI scanner, including the stimulus presentation system, software and interfacing. * work with cognitive scientists and clinicians to define, establish and run suitable test paradigms on healthy volunteers and patients * establish networking, image-processing, analysis, visualisation and archiving facilities for scientific and clinical data obtained in fMRI studies. * work with cognitive scientists and clinicians to explore the uses of real-time and near-real-time analysis techniques in fMRI studies.
The postholders will be encouraged to undertake original research in one or more of the following areas of fMRI: novel pulse sequences; statistical models of physical and physiological noise; cognitive modelling; parallel statistical and image-processing algorithms; visualisation techniques.

Candidates should hold a higher degree in an appropriate subject and should be familiar with PC and/or Unix computing. Expertise in cognitive modelling, MRI, statistics, image analysis, and visualisation is highly desirable. At least one postholder will have experience with parallel computing on Unix platforms.

Salary scale: 15,735-23,651 p.a. Both posts are tenable for 3 years initially, with renewal subject to success in attracting further funding. For further details see http://www.cns.ed.ac.uk/ciia

Closing date: 6 November 1998. Interviews are expected 16-20 November 1998.

Further particulars and application details should be obtained from the Personnel Department, The University of Edinburgh, 1 Roxburgh Street, Edinburgh, EH8 9TB, Tel. 0131 650 2511 (24 hour answering service), quoting reference number 896785.

Informal enquiries may be made to:

Dr Nigel Goddard
Inst. for Adaptive and Neural Computation
Tel: +44 131 650 3087
Fax: +44 131 650 6899
email: Nigel.Goddard at ed.ac.uk

or to:

Dr Ian Marshall
Dept. Medical Physics
Tel: +44 131 537 1661/2155
FAX: +44 131 537 1026
email: I.Marshall at ed.ac.uk

From at at coglit.soton.ac.uk Mon Oct 19 12:41:33 1998 From: at at coglit.soton.ac.uk (Adriaan Tijsseling) Date: Mon, 19 Oct 1998 17:41:33 +0100 Subject: PhD Thesis now available as postscript (Conn. Models of Cat.)
Message-ID:

The following PhD thesis is available as postscript via

http://cogito.psy.soton.ac.uk/~at/CALM/Title.ps
http://cogito.psy.soton.ac.uk/~at/CALM/Abstract.ps
http://cogito.psy.soton.ac.uk/~at/CALM/PhD.ps
http://cogito.psy.soton.ac.uk/~at/CALM/Colors.ps

Adriaan Tijsseling

------------------------ Abstract ------------------------

Connectionist Models of Categorization: A Dynamical View of Cognition

by Adriaan Tijsseling

The functional role of altered similarity structure in categorization is analyzed. 'Categorical Perception' (CP) occurs when equal-sized physical differences in the signals arriving at our sensory receptors are perceived as smaller within categories and larger between categories (Harnad, 1987). Our hypothesis is that it is by modifying the similarity between internal representations that successful categorization is achieved. This effect depends in part on the iconicity of the inputs, which induces a similarity-preserving structure in the internal representations. Categorizations based on the similarity between stimuli are easier to learn than contra-iconic categorizations; it is mainly to modify the latter in the service of categorization that the characteristic compression/separation of CP occurs.

This hypothesis was tested in a series of neural net simulations of studies on category learning in human subjects. The nets are first pre-exposed to the inputs and then given feedback on their performance. The behavior of the resulting networks was then analyzed and compared to human performance. Before it is given feedback, the network discriminates and categorizes input based on the inherent similarity of the input structure. With corrective feedback the net moves its internal representations away from category boundaries. The effect is that similarity of patterns that belong to different categories is decreased, while similarity of patterns from the same category is increased (CP).
Neural net simulations make it possible to look inside a hypothetical black box of how categorization may be accomplished; it is shown how increased attention to one or more dimensions in the input and the salience of input features affect category learning. Moreover, the observed 'warping' of similarity space in the service of categorization can provide useful functionality by creating compact, bounded chunks (Miller, 1956) with category names that can then be combined into higher-order categories described by the symbol strings of natural language and the language of thought (Greco, Cangelosi, & Harnad, 1997). The dynamic models of categorization of the kind analyzed here can be extended to make them powerful models of neuro-symbolic processing (Casey, 1997) and a fruitful territory for future research.

From harnad at coglit.soton.ac.uk Tue Oct 6 11:42:32 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Tue, 6 Oct 1998 16:42:32 +0100 (BST) Subject: Representation: Psycoloquy Call for Commentators Message-ID:

Markman/Dietrich: Representation/Mediation

The target article whose abstract appears below has just appeared in PSYCOLOQUY, a refereed journal of Open Peer Commentary sponsored by the American Psychological Association. Qualified professional biobehavioral, neural or cognitive scientists are hereby invited to submit Open Peer Commentary on it. Please email for Instructions if you are not familiar with the format or acceptance criteria for PSYCOLOQUY commentaries (all submissions are refereed).

To submit articles and commentaries or to seek information:

EMAIL: psyc at pucc.princeton.edu
URL: http://www.princeton.edu/~harnad/psyc.html
http://www.cogsci.soton.ac.uk/psyc

AUTHOR'S RATIONALE FOR SOLICITING COMMENTARY: There has been a growing sentiment in cognitive science, particularly among advocates of dynamical systems and situated action, that traditional approaches to representation should be abandoned.
Calls for the elimination of representation, as well as previous attempts to defend representation, have unfortunately talked past each other for lack of common ground. This target article attempts to provide a common ground for the debate over representation. Many proponents of representations are searching for a single representational system to serve as the basis of all cognitive models; this paper argues that multiple approaches to representation must coexist in cognitive models. We also hope to elicit discussion about what properties of representations are critical for cognitive models.

Full text of the article is available at: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.48 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.48.representation-mediation.1.markman

-----------------------------------------------------------------------

psycoloquy.98.9.48.representation-mediation.1.markman Mon Oct 5 1998
ISSN 1055-0143 (68 paragraphs, 78 references, 1297 lines)
PSYCOLOQUY is sponsored by the American Psychological Association (APA)
Copyright 1998 Arthur B. Markman & Eric Dietrich

IN DEFENSE OF REPRESENTATION AS MEDIATION

Arthur B. Markman
Department of Psychology
University of Texas
Austin, TX 78712
markman at psy.utexas.edu
http://www.psy.utexas.edu/psy/FACULTY/Markman/index.html

Eric Dietrich
PACCS Program in Philosophy
Binghamton University
Binghamton, NY
dietrich at binghamton.edu
http://www.binghamton.edu/philosophy/home/faculty/index.htm

ABSTRACT: Some cognitive scientists have asserted that cognitive processing is not well modeled by the classical notions of representation and process that have dominated psychology and artificial intelligence since the cognitive revolution. In response to this claim, the concept of a mediating state is developed. Mediating states are the class of information-carrying internal states used by cognitive systems, and as such are accepted even by those researchers who reject representations.
The debate over representation, then, is actually one about what additional properties of mediating states are necessary for explaining cognitive processing. Five properties that can be added to mediating states are examined for their importance in cognitive models.

KEYWORDS: compositionality, computation, connectionism, discrete states, dynamic systems, explanation, information, meaning, mediating states, representation, rules, semantic content, symbols

Full text of the article is available at: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.48 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.48.representation-mediation.1.markman

To submit articles and commentaries or to seek information:

EMAIL: psyc at pucc.princeton.edu
URL: http://www.princeton.edu/~harnad/psyc.html
http://www.cogsci.soton.ac.uk/psyc

From sue at soc.plym.ac.uk Tue Oct 20 10:15:00 1998 From: sue at soc.plym.ac.uk (Sue Denham) Date: Tue, 20 Oct 1998 15:15:00 +0100 Subject: Lecturership Post Available Message-ID: <1.5.4.32.19981020141500.0073f550@soc.plym.ac.uk>

University of Plymouth
School of Computing

Lecturer/Senior Lecturer in Computational Intelligence in Finance and Business

Applications are invited for the above position. Candidates must have either: i) a PhD degree in the area of computational intelligence (neural computing or evolutionary computing) and an interest in, together with some experience of, the application of these computing technologies in finance, investment and business, or ii) a higher degree in a financial/business discipline and at least five years' experience of applying computational intelligence techniques in the financial/business sector. A record of research publication in this field is highly desirable.
The person appointed will be expected: i) to lead the development of teaching and research in this field, working within the School's Centre for Neural and Adaptive Systems, and ii) to take a leading role in the design and promotion of a new MSc programme in Computational Intelligence in Finance and Business, which the School is planning to start within the next two years.

Further details of the position are available from:

Dr Sue Denham
Centre for Neural and Adaptive Systems
School of Computing
University of Plymouth
Plymouth PL4 8AA
England
tel: +44 17 52 23 26 10
fax: +44 17 52 23 25 40
e-mail: sue at soc.plym.ac.uk
http://www.tech.plym.ac.uk/soc/research/neural/index.html

From debodt at fin.ucl.ac.be Wed Oct 21 10:07:16 1998 From: debodt at fin.ucl.ac.be (de Bodt Eric) Date: Wed, 21 Oct 1998 16:07:16 +0200 Subject: ACSEG98 - Call for participation Message-ID: <043f01bdfcfc$1bc55fc0$84996882@pcdebodt.ucl.ac.be>

CONNECTIONIST APPROACHES IN ECONOMICS AND MANAGEMENT SCIENCES
FIFTH INTERNATIONAL MEETING
COMPLEX DATA: MODELING AND ANALYSIS

CALL FOR PARTICIPATION

Since the beginning of the 80s, important advances have been made in developing diverse new approaches of bio-mimetic inspiration (neural nets, genetic algorithms, cellular automata, etc.). These approaches are of prime interest for researchers both in Economics and in Management Sciences. The ACSEG International Meetings give the opportunity to assess the state of the art in the domain, to delineate future developments, and to evidence the contribution of bio-mimetic methods to Economics and Management Sciences. They also allow researchers to present their recent work, to exchange know-how, and to discuss the problems encountered in their research.

The 1998 ACSEG International Meeting on Complex Data: Modeling and Analysis will take place at the Universite catholique de Louvain, November 20, 1998.
The organizers are the research centers SAMOS (Universite de Paris 1 - Pantheon - Sorbonne), CEGF (Universite catholique de Louvain) and CeReFim (Facultes Universitaires Notre-Dame de la Paix). If interested, check the conference page http://mkb.fin.ucl.ac.be/Acseg98 (which presents the list of accepted papers as well as the program of the day) or, for additional information, write to: ACSEG98, Centre d'Etudes en Gestion Financiere, Institut d'Administration et de Gestion, Universite catholique de Louvain, 1 place des Doyens, 1348 Louvain-la-Neuve, Belgium (Fax: +(32).10.47.83.24).

From d.mareschal at psychology.bbk.ac.uk Wed Oct 21 04:58:42 1998 From: d.mareschal at psychology.bbk.ac.uk (Denis Mareschal) Date: Wed, 21 Oct 1998 09:58:42 +0100 Subject: connectionist models of learning and development Message-ID:

Dear all,

Members of this list may be interested in the following new journal. There has recently been an increase in the amount of computational and especially connectionist modelling of learning and development in infancy and childhood. Many of the researchers in the field want to find a natural outlet for the publication of their work. Developmental Science is a journal that specifically solicits manuscripts reporting on computational and connectionist models of learning and development. Also of interest are empirical studies that test existing models.

Apologies to those who may receive multiple copies of this message.

Best Regards,

Denis

-------------------- INSERT TEXT --------------------------

The first issue of Developmental Science, the journal of the European Society for Developmental Psychology, has now been published.
Developmental Science represents the best of contemporary scientific developmental psychology, both in the presentation of theory and in reporting new data. Developmental Science will include:

* Comparative and biological perspectives
* Connectionist and computational perspectives
* Dynamical systems theory

VOLUME 1, ISSUE 1 CONTENTS:

Peer commentary article
* Uniquely Primate, Uniquely Human - Michael Tomasello

Peer commentaries on Tomasello: "Uniquely Primate, Uniquely Human"
* Comment on Michael Tomasello's Uniquely Primate, Uniquely Human - Merlin Donald
* The Shaping of Social Cognition in Evolution and Development - Andrew Whiten
* Comment on "Uniquely Primate, Uniquely Human" - Marc D Hauser
* Uniquely to what ends? - Jonas Langer
* Simian Similarities, Schisms, and "Special Social Skills" - James R Anderson

Reply by the author
* Response to Commentators - Michael Tomasello

Reports
* A nonhuman primate's expectations about object motion and destination: the importance of self-propelled movement and animacy - Marc D Hauser
* Development of Precision Grips in Chimpanzees - George Butterworth & Shoji Itakura
* Development of Selective Attention in Young Infants: Enhancement and Attenuation of Startle Reflex by Attention - John E Richards
* Visual attention in infants with perinatal brain damage: Evidence of the importance of anterior lesions - Mark H Johnson, Leslie A Tucker, Joan Stiles & Doris Trauner
* Gravity Does Rule for Falling Events - Bruce M Hood
* Imitation across Changes in Object Affordances and Social Context in 9-Month-Old Infants - Emmanuel Devouche
* Object Individuation in Young Infants: Further Evidence with an Event-Monitoring Paradigm - Teresa Wilcox & Renee Baillargeon
* Computational Evidence for the Foundations of Numerical Competence - Tony J Simon
* Newborns Learn to Identify a Face in Eight Tenths of a Second?
- Gail Walton, Erika Armstrong & Thomas G R Bower

Papers
* Global Influences on the Development of Spatial and Object Perceptual Categorization Abilities: Evidence from Preterm Infants - Clay Mash, Paul C Quinn, Velma Dobson & Dana B Narter
* A Computational Analysis of Conservation - Thomas R Shultz
* We almost had a great future behind us: the contribution of non-linear dynamics to developmental-science-in-the-making - Paul van Geert

VOLUME 1, ISSUE 2 CONTENTS:

Article with peer commentary and a reply by the author
* Infant perseveration and implications for object permanence theories: A PDP model of the AB task - Y. Munakata

Commentaries
* Understanding the A not B error: Working memory vs reinforced response, or active trace vs latent trace - A. Diamond
* Toward a general model of perseveration in infancy - R. Baillargeon & A. Aguiar
* Commentary on Munakata - J. G. Bremner
* The development of delayed response: parallel distributed processing lacks neural plausibility - S. Dehaene
* On theory and modeling - J. Mandler
* To reach or not to reach: that is the question - Denis Mareschal
* Commentary on Munakata's theory of object permanence development - J. S. Reznick
* Infant perseveration and implications for object permanence theories: A PDP model of the A not B task - J. Russell
* Babies have bodies: Why Munakata's net fails to meet its own goals - L. B. Smith

Reply by the author
* Infant perseveration: rethinking data, theory and the role of modeling - Y. Munakata

Reports
* Special Section: Work from the Medical Research Council Cognitive Development Unit, London - John Morton, Uta Frith, Mark Johnson & Annette Karmiloff-Smith
* Is Dutch native English? Linguistic analysis by 2 month olds - A. Christophe & J. Morton
* Object centred attention in 8 month olds - M. Johnson & R. O.
Gilmore

If you would like further information regarding Developmental Science, including notes for contributors and editorial information, please log on to: Alternatively, contact Rachel Manns at Blackwell Publishers by email on rmanns at blackwellpublishers.co.uk or by fax on +44 (0) 1865 381362

=================================================
Dr. Denis Mareschal
Centre for Brain and Cognitive Development
Department of Psychology
Birkbeck College
University of London
Malet St., London WC1E 7HX, UK
tel +44 171 631-6582/6207
fax +44 171 631-6312
=================================================

From girosi at massa-intermedia.ai.mit.edu Thu Oct 22 12:34:21 1998 From: girosi at massa-intermedia.ai.mit.edu (Federico Girosi) Date: Thu, 22 Oct 98 12:34:21 EDT Subject: NEW BOOK by Partha Niyogi Message-ID: <9810221634.AA04234@massa-intermedia.mit.edu>

NEW BOOK *** NEW BOOK *** NEW BOOK *** NEW BOOK *** NEW BOOK ***

People might be interested in the following book on the relationship between learning, neural networks and generative grammar.

Federico Girosi

-----------------------------------------------------------------------

The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar

Partha Niyogi (MIT and Bell Laboratories)
[Kluwer Academic Publishers: ISBN 0792380819]

Among other topics, this book brings together two important but very different learning problems within the same analytical framework. The first is the problem of learning functional mappings using Neural Networks; the second is learning natural language grammars in the Principles and Parameters tradition of Chomsky. The two learning problems are seemingly very different. Neural networks are real-valued, infinite-dimensional, continuous mappings. Grammars are boolean-valued, finite-dimensional, discrete (symbolic) mappings. Furthermore, the research communities that work in the two areas almost never overlap. The objective of this book is to bridge this gap.
It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question -- how much information does it take to learn -- of both problems, it highlights their similarities and differences. It shows how "setting parameters" in the principles and parameters tradition of linguistic theory and "learning the connections" in neural networks are conceptually very similar problems, and that both have reasonable statistical formulations. At the same time, the results from learning theory are used to argue that both processes must be highly constrained for learning to happen. Specific results include model selection in neural networks, active learning, language learning and evolutionary models of language change.

"The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar" is a very interdisciplinary work. Anyone interested in the interaction of computer science and cognitive science should enjoy the book. Researchers in artificial intelligence, neural networks, linguistics, theoretical computer science and statistics will find it particularly relevant.

From Annette_Burton at Brown.edu Thu Oct 22 10:15:18 1998 From: Annette_Burton at Brown.edu (Annette Burton) Date: Thu, 22 Oct 1998 10:15:18 -0400 Subject: Job Announcement Message-ID:

Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce

A NEW INTERDISCIPLINARY POSTDOCTORAL OPPORTUNITY
in
LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES

As part of an NSF award to Brown University through the IGERT program, the Departments of Cognitive and Linguistic Sciences, Computer Science, and Applied Mathematics will be hiring two Postdoctoral Research Associates.
Fellows will be scholars who have displayed significant interest and ability in conducting collaborative interdisciplinary research in one or more of the research areas of the program: computational and empirical approaches to uncertainty in language, vision, action, or human reasoning. As well as participating in collaborative research, responsibilities will include helping to coordinate cross-departmental graduate teaching and research as well as some teaching of interdisciplinary graduate courses. We expect that these fellows will play an important role in creating a highly visible presence for the IGERT program at Brown, and their interdisciplinary activities will help unify the interdepartmental activities of the IGERT program.

Applicants must hold a PhD in Cognitive Science, Linguistics, Computer Science, Mathematics, Applied Mathematics, or a related discipline, or show evidence that the PhD will be completed before the start of the position. Applicants should send a vita and three letters of reference to Steven Sloman, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, RI 02912. Special consideration will be given to those applicants whose research is relevant to at least two of the participating departments.

The positions will begin September 1, 1999 for one year, renewable upon satisfactory completion of duties in the first year. Salaries will be between $35,000 and $45,000 per year. All materials must be received by Jan. 1, 1999, for full consideration. Brown University is an Equal Opportunity/Affirmative Action Employer.
For additional information about the program and ongoing research initiatives, please visit our website at: http://www.cog.brown.edu/IGERT

From kehagias at egnatia.ee.auth.gr Fri Oct 23 00:24:47 1998 From: kehagias at egnatia.ee.auth.gr (Thanasis Kehagias) Date: Thu, 22 Oct 1998 21:24:47 -0700 Subject: Data Allocation Message-ID: <199810221822.VAA25531@egnatia.ee.auth.gr>

INTRODUCTION: With respect to my query regarding the DATA ALLOCATION problem, I got several interesting replies and biblio pointers. However, I feel that my original question has NOT been answered yet. I want to keep this message relatively brief. I first present a summary of the original question, then my own results to date, then the responses by other people and some of my subsequent thoughts. At the end of this message you can find a short BIBLIOGRAPHY of related references. A more complete presentation and a larger BIBTEX file can be found at http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm

=========================================================================

DATA ALLOCATION PROBLEM: The full description was given in a previous message on this list and can also be found at http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm . In short the problem is this: a time series is generated by alternate activation of two sources. The time series is observed, but the source process is not. It is required to find the source process.

MY QUESTION: Suppose SOME online filter is used to separate the data into two classes, using incoming data to retrain the filter (and thus sharpen the separation criterion). What can be said about the CONVERGENCE of this filter to correct data allocation (alias separation or segmentation) using only VERY GENERAL assumptions about the filtering algorithm and the nature of the sources?
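[Moderator's illustrative aside: the following toy simulation of the setup described above is purely hypothetical and is not taken from any of the papers or algorithms discussed in this thread. The two Gaussian sources, the distance-to-mean allocation criterion, and the running-mean retraining rule are all assumptions chosen only to make the question concrete; a real filter would be far more sophisticated.]

```python
import random

random.seed(0)

def simulate(T=20000, mu1=0.0, mu2=4.0, sigma=1.0, thresh=2.0):
    """Toy online data-allocation filter (illustrative assumptions only).

    A scalar time series is generated by alternate activation of two
    Gaussian sources (blocks of 100 samples each).  The "filter" is just
    a running mean of the samples it has accepted so far: a new sample is
    accepted into the data pool iff it lies within `thresh` of the current
    mean, and every accepted sample retrains (updates) the mean online.
    M and N count accepted samples from source 1 and source 2.
    """
    mean, count = 0.0, 1   # filter state, seeded near source 1
    M = N = 0
    for t in range(T):
        src = 1 if (t // 100) % 2 == 0 else 2   # alternate activation
        y = random.gauss(mu1 if src == 1 else mu2, sigma)
        if abs(y - mean) < thresh:              # allocation criterion
            count += 1
            mean += (y - mean) / count          # online retraining
            if src == 1:
                M += 1
            else:
                N += 1
    return M, N

M, N = simulate()
# With well-separated sources the pool is dominated by one source,
# i.e. the proportion M/(M+N) ends up near 0 or near 1.
print(M, N, M / (M + N))
```

In this toy the sources are 4 sigma apart, so the separability intuition holds and the accepted pool is dominated by one source; shrinking the gap mu2-mu1 toward zero destroys separability and the pool mixes.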
MAIN RESULT: Here is the main result I have so far been able to obtain, stated VERY INFORMALLY:

"THEOREM": Say that at time t the filter has accepted into its data pool M(t) samples from source1 and N(t) samples from source2. IF: the COMBINATION of source1, source2, the data allocation criterion and the filter retraining algorithm satisfies certain SEPARABILITY conditions, THEN: as t goes to infinity, the proportion M(t)/(M(t)+N(t)) goes either to zero or to one, with probability one.

REMARK: This means that asymptotically, the filter will be trained on a data pool which contains predominantly either source1 data or source2 data.

A bunch of (real) theorems along the above lines are stated and proved in Part III of "PREDICTIVE MODULAR NEURAL NETWORKS" (http://skiron.control.ee.auth.gr/~kehagias/thn/thn02b01.htm). Note that the conditions used in the above "Theorem" (as well as in the true theorems) are quite general: the separability conditions do not refer to a particular algorithm. It follows that D. Ringach's remark (see below) is right to the point: some kind of SEPARABILITY assumption is necessary.

SOME REMARKS BY OTHER PEOPLE: Some people responded by citing papers which use specific algorithms to solve the above problem. For instance, J. Kohlmorgen cited his own work, R. Rohwer mentioned the mixture model problem, and A. Storkey cited his own work and also papers by (a) Z. Ghahramani and G. Hinton, and (b) A. Weigend et al. S. Waterhouse mentioned a range of possibilities (mixture models, mixtures of experts, HMMs, HM Trees) and proposed some possible algorithmic approaches; he also referred me to his WEB page (http://www.ultimode.com/stevew/) for related references. Finally, D. Ringach made a very significant observation, which I reproduce here:

>I don't see how you can do anything unless you assume something about
>the sources... Otherwise, how could you rule out the null hypothesis
>that is a single source with the measured distribution of y(i)?
MY RESPONSE: Most of the above people proposed specific ALGORITHMS to solve the problem. However, my original question was about the data allocation CONVERGENCE problem, in an UNSUPERVISED ONLINE setting and at a GENERAL level (not related to specific algorithms). Most of the pointers I got contained no convergence analysis. At any rate, I am familiar with a number of similar algorithms for which ALGORITHM-SPECIFIC convergence analyses ARE available (e.g. for the LVQ algorithm). I have attempted to give a GENERAL analysis in the PAPERS and BOOK referred to in my previous messages and in my home page (http://skiron.control.ee.auth.gr/~kehagias/thn/thn.htm). I also present some relevant thoughts at http://skiron.control.ee.auth.gr/~kehagias/thn/thn030.htm .

BIBLIOGRAPHY: Here are a FEW references about the data allocation problem (alias segmentation or data separation problem). You can find a more complete BIBTEX file at http://skiron.control.ee.auth.gr/~kehagias/thn/thn0301.bib . These references are mostly about algorithms to effect data allocation, either in an offline or online fashion; usually (but not always) the question of convergence is not treated theoretically. I should stress that the list below, as well as the BibTeX file at my home site, is by no means an exhaustive bibliographic coverage. They are just lists of some papers I have read and enjoyed.

----------------------------------------------------------------------------

1. Ghahramani, Z. and Hinton, G.E. (1998). Switching State-Space Models. Submitted for publication.
2. Jordan, M.I. and Jacobs, R.A. (1994). "Hierarchical mixtures of experts and the EM algorithm." Neural Computation, 6, 181-214.
3. Jordan, M.I. and Xu, L. "Convergence results for the EM approach to mixtures of experts architectures." Neural Networks, 8, 1409-1431.
4. Kohlmorgen, J., Müller, K.-R. and Pawelzik, K.
(1998). Analysis of Drifting Dynamics with Neural Network Hidden Markov Models. In NIPS '97: Advances in Neural Information Processing Systems 10, MIT Press, to appear in 1998.
5. Levin, E. (1993). "Hidden control neural architecture modeling of nonlinear time varying systems and its applications." IEEE Trans. on Neural Networks, 4, 109-116.
6. Pawelzik, K., Kohlmorgen, J. and Müller, K.-R. (1996). Annealed Competition of Experts for a Segmentation and Classification of Switching Dynamics. Neural Computation, 8, 340-356.
7. Storkey, A.J. Gaussian Processes for switching regimes. ICANN98.
8. Waterhouse, S. and Robinson, T. (1994). "Classification Using Hierarchical Mixtures of Experts." Presented at the IEEE Conference on Neural Networks and Signal Processing.
9. Waterhouse, S. and Robinson, T. (1995). "Constructive Methods for Mixtures of Experts" (UK version). Presented at NIPS: Neural Information Processing Systems.
10. Weigend, A.S., Mangeas, M. and Srivastava, A.N. (1995). Nonlinear gated experts for time series: discovering regimes and avoiding overfitting.
11. Xu, L. and Jordan, M.I. (1996). "On convergence properties of the EM Algorithm for Gaussian mixtures." Neural Computation, 8, 129-151.

___________________________________________________________________

Ath. Kehagias
--Assistant Prof. of Mathematics, American College of Thessaloniki
--Research Ass., Dept. of Electrical and Computer Eng.
Aristotle Univ., Thessaloniki, GR54006, GREECE
--email: kehagias at egnatia.ee.auth.gr, kehagias at ac.anatolia.edu.gr
--web: http://skiron.control.ee.auth.gr/~kehagias/index.htm

From ASJagath at ntu.edu.sg Fri Oct 23 03:49:48 1998 From: ASJagath at ntu.edu.sg (Jagath C Rajapakse (Dr)) Date: Fri, 23 Oct 1998 15:49:48 +0800 Subject: graduate student opportunities Message-ID: <6665AC0C667ED11186E308002BB487E10366CF41@exchange2>

> School of Applied Science
> Nanyang Technological University
> Singapore
>
> Graduate Research Opportunities in Neuroimaging and Neurocomputing
>
> These are exciting opportunities for graduate students to do research in
> neuroimaging and neurocomputing, which are available through a
> collaborative research project between the Singapore Gamma Knife Center,
> Singapore General Hospital, and the School of Applied Science, Nanyang
> Technological University.
>
> The functional MRI scans of the human brain during cognitive tasks, obtained
> at the MRI facility of the Singapore General Hospital, will be analyzed
> using neural network paradigms and statistical techniques to detect human
> brain activation. Students will join a research group analyzing functional
> MR images and time series to interpret human brain activities and infer
> neuronal events in cognitive experiments. The research positions are
> initially available to complete a Masters degree, and may be continued
> towards a Ph.D. degree depending on performance in the first year.
>
> Research students will receive a monthly stipend above S$1400, depending on
> qualifications. The positions will be available during the year 1999.
>
> Prospective students should contact Dr. Jagath Rajapakse for further
> details and applications:
>
> Dr. Jagath C.
Rajapakse
> School of Applied Science
> Nanyang Technological University
> Nanyang Avenue, Singapore 639798
> Email: asjagath at ntu.edu.sg
> Phone: +65 790 5802

From jose at tractatus.rutgers.edu Fri Oct 23 10:56:26 1998 From: jose at tractatus.rutgers.edu (Stephen Jose Hanson) Date: Fri, 23 Oct 1998 10:56:26 -0400 Subject: RUTGERS-NEWARK JUNIOR POSITION in COG SCI/COG NEURO Message-ID: <3630991A.4B8F432@tractatus.rutgers.edu>

The Department of Psychology of Rutgers University-Newark Campus anticipates making ONE tenure-track appointment in Cognitive Science or Cognitive Neuroscience at the Assistant Professor level. The Psychology Department has made three related appointments in the last two years and is expanding rapidly in this area. Candidates should have an active research program in memory, learning, categorization, attention, action, high-level vision or language. Particular interest will exist in candidates who combine one or more of the research interests above with FUNCTIONAL IMAGING (e.g. fMRI, PET, or ERP). Psychology has an ongoing affiliation/collaboration with the UMDNJ fMRI laboratory.

Review of applications will begin on January 8, 1999, but applications will continue to be accepted until the position is filled. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send a CV, three letters of recommendation and 2 reprints to Professor S. J. Hanson, Chair, Department of Psychology - Cognitive Science Search, Rutgers University, Newark, NJ 07102.
Email enquiries can be made to cogsci at psychology.rutgers.edu; also see http://www.psych.rutgers.edu From cindy at cns.bu.edu Thu Oct 22 15:45:21 1998 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Thu, 22 Oct 1998 15:45:21 -0400 Subject: May 1999 Conference Message-ID: <199810221945.PAA01781@retina.bu.edu> ***** CALL FOR PAPERS ***** THIRD INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 26-29, 1999 Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from DARPA and ONR How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? The conference will include invited tutorials and lectures, and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. Abstract-only submission encourages reporting of the latest results. Costs are kept to a minimum without compromising the quality of meeting handouts and social events. CONFIRMED INVITED SPEAKERS INCLUDE: Andreas Andreou Randolph Blake Rodney Brooks Gail Carpenter Dario Floreano Joaquin Fuster Paolo Gaudiano Charles Gilbert Larry Gillick Steven Greenberg Stephen Grossberg Michael Hasselmo Joseph LeDoux John Lisman Ennio Mingolla Tomaso Poggio Daniel Schacter Shihab Shamma Richard Shiffrin Nobuo Suga David van Essen Steven Zucker There will be contributed oral and poster sessions on each day of the conference.
CALL FOR ABSTRACTS

Session Topics:
* vision
* spatial mapping and navigation
* object recognition
* neural circuit models
* image understanding
* neural system models
* audition
* mathematics of neural systems
* speech and language
* robotics
* unsupervised learning
* hybrid systems (fuzzy, evolutionary, digital)
* supervised learning
* neuromorphic VLSI
* reinforcement and emotion
* industrial applications
* sensory-motor control
* other
* cognition, planning, and attention

Contributed Abstracts must be received, in English, by January 29, 1999. Notification of acceptance will be given by February 28, 1999. A meeting registration fee of $45 for regular attendees and $30 for students must accompany each Abstract. See Registration Information for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted abstracts will be returned on request only until April 15, 1999. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; and a first and second choice from among the topics above, including whether it is biological (B) or technological (T) work. Example: first choice: vision (T); second choice: neural system models (B). (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned.
Accepted Abstracts will be printed in the conference proceedings volume. No full-length paper will be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to cover meeting travel and living costs. The deadline to apply for fellowship support is January 29, 1999. Applicants will be notified by February 28, 1999. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Reimbursement checks will be distributed after the meeting.
REGISTRATION FORM Third International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 26, 1999 Meeting: May 27-29, 1999 FAX: (617) 353-7755 (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. CHECK ONE: ( ) $70 Conference plus Tutorial (Regular) ( ) $45 Conference plus Tutorial (Student) ( ) $45 Conference Only (Regular) ( ) $30 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). 
Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ From cns-cas at cns.bu.edu Thu Oct 22 11:17:46 1998 From: cns-cas at cns.bu.edu (Boston University - Cognitive and Neural Systems) Date: Thu, 22 Oct 1998 11:17:46 -0400 Subject: Faculty Opening CNS-BU Message-ID: <199810221517.LAA18379@cochlea.bu.edu> NEW FACULTY IN COGNITIVE AND NEURAL SYSTEMS AT BOSTON UNIVERSITY Boston University seeks an assistant or associate professor for its graduate Department of Cognitive and Neural Systems. The department offers an integrated curriculum of psychological, neurobiological, and computational concepts, models, and methods in the fields of computational neuroscience, connectionist cognitive science, and neural network technology, in which Boston University is a leader. Candidates should have an outstanding research profile, preferably including extensive analytic or computational research experience in modeling a broad range of nonlinear neural networks, especially in one or more of the following areas: vision and image processing, adaptive pattern recognition, cognitive information processing, speech and language, adaptive sensory-motor control, and neural network technology. Send a complete curriculum vitae and three letters of recommendation to Search Committee, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston University, Boston, MA 02215. Boston University is an Equal Opportunity/Affirmative Action employer. (please post) From lambri at ifi.unizh.ch Wed Oct 28 10:02:07 1998 From: lambri at ifi.unizh.ch (Dimitrios Lambrinos) Date: Wed, 28 Oct 1998 15:02:07 +0000 Subject: Ph.D.
position at AILab, Zurich Message-ID: <363731EF.D0588F50@ifi.unizh.ch> ------------------------------------------ Position for a Ph.D. student in Biorobotics at the AILab, University of Zurich ------------------------------------------ A new Ph.D. student position is open at the Artificial Intelligence Laboratory, Dept. of Computer Science of the University of Zurich. Availability: Immediately or at earliest convenience. Continuing previous work conducted at the AILab, this research will focus on Biorobotics, i.e. on building autonomous agents based on biological findings that are capable of robust visually guided behavior in complex real-world environments. Biorobotics, with its goal of bringing together biology/neurobiology, engineering, and computer science, has enormous potential. On the one hand, it will change our perception of biological intelligence; on the other, it will inspire and influence the way we design, build, and use information technology. The main task of the agents will be to safely navigate in complex environments. This will require scaling up previous theoretical and practical work. The challenge will be, first, to demonstrate that mechanisms which are thought to be employed by natural systems can be implemented in real-world artifacts, and second, that such mechanisms, though very parsimonious, are sufficient for achieving complex behavior. If the above challenges capture your interest, and you would like to become a member of an international research team conducting transdisciplinary work, submit a curriculum vitae, a statement of research interests, and the names of three referees ASAP to: Corinne Maurer Dept.
of Computer Science University of Zurich Winterthurerstrasse 190 CH - 8057 Zurich, Switzerland E-mail: maurer at ifi.unizh.ch Phone: 41 - 1 - 635 43 31 Fax: 41 - 1 - 635 68 09 Profile: Applicants should have an MSc degree, or equivalent, in one of the following areas: computer science, electrical or mechanical engineering, biology, neurobiology, physics, mathematics, cognitive science, or related disciplines. They should have good programming skills (C, C++, etc.), preferably experience with robot programming, knowledge of electronics, and a strong scientific background. Tasks: The main task for the accepted candidate will be to conduct research towards his/her Ph.D. Additional tasks include support for classes organized by the AI-Lab as well as other administrative tasks required by the computer science department. Financial: The salary will be set according to the specifications of the Swiss National Science Foundation. Time prospect: The candidate is expected to complete his/her Ph.D. work within a maximum period of 4 years. From smyth at sifnos.ics.uci.edu Wed Oct 28 17:41:05 1998 From: smyth at sifnos.ics.uci.edu (Padhraic Smyth) Date: Wed, 28 Oct 1998 14:41:05 -0800 Subject: Tenure track faculty position at UC Irvine Message-ID: <9810281441.aa26794@paris.ics.uci.edu> Dear Connectionists, FYI, the tenure-track position advertised below encompasses research topics closely related to neural computation, such as computational statistics and computational biology. I would be grateful if you would pass this information along to any of your colleagues or students who may be interested. I am happy to answer any general questions about the department and UCI if you wish to contact me by email. Padhraic Smyth Associate Professor Information and Computer Science University of California, Irvine.
smyth at ics.uci.edu Open Faculty Position in Information and Computer Science at UC Irvine http://www.ics.uci.edu/interfac.html The Department of Information and Computer Science (ICS) has a tenure-track position open in the general area of interdisciplinary applications of computing. Research emphases include areas such as computational statistics, scientific data visualization, computer graphics and animation, computational biology, medical informatics, and information organization, storage, retrieval, and visualization. The available position is at the assistant professor level, but exceptional candidates from all ranks will be considered. In all cases, we are looking for applicants with a Ph.D. degree in Computer Science or a related field, and strong research credentials as evidenced by scholarly publications. Applicants for senior positions must also demonstrate a proven track record in original research and teaching activities. The ICS Department is organized as an independent campus unit reporting to the Executive Vice Chancellor. It runs the second most popular major at UCI and has designed an undergraduate honors program that attracts the campus' most qualified students. ICS faculty are at the forefront of research in emerging areas of the computer science discipline such as multimedia/embedded computing, knowledge discovery in databases, bioinformatics, and the role of information in computer science and society. The faculty has effective interdisciplinary ties to colleagues in biology, cognitive science, engineering, management, medicine, and the social sciences. The Department currently has 32 full-time faculty and 125 Ph.D. students involved in various research areas including computer science theory, artificial intelligence, networks and distributed systems, databases, multimedia systems, computer systems design, software/software engineering, human-computer interaction, and computer-supported cooperative work.
ICS at UC Irvine represents one of the fastest-growing departments and a computer science program that builds upon our strengths in core as well as growth areas of computer science. Although UCI is a young university, it has attained remarkable stature in the past 3 decades. Two Nobel prizes were recently awarded to UCI faculty. UCI is located three miles from the Pacific Ocean near Newport Beach, approximately forty miles south of Los Angeles. Irvine is consistently ranked among the safest cities in the U.S. and has an exceptional public school system. The campus is surrounded by high-technology companies that participate in an active affiliates program. Both the campus and the area offer exciting professional and cultural opportunities. Mortgage and housing assistance are available, including newly built, for-sale housing located on campus and within a short walking distance of the department. Applicants should send a cover letter indicating that they are applying for the "interdisciplinary applications of computing faculty position, Position E", a CV, sample papers, and contact information for five references to: ICS Faculty Position E c/o Joy Schuler Department of Information and Computer Science University of California, Irvine Irvine, CA 92697-3425. Application screening will begin immediately upon receipt of a curriculum vitae. Maximum consideration will be given to applications received by December 1, 1998. Salaries are commensurate with experience. The University of California is an Equal Opportunity Employer, committed to excellence through diversity.
From MA_S435 at kingston.ac.uk Thu Oct 8 08:17:14 1998 From: MA_S435 at kingston.ac.uk (Dimitris Tsaptsinos) Date: Thu, 8 Oct 1998 13:17:14 +0100 Subject: CFP - EANN99 Message-ID: Fifth International Conference on Engineering Applications of Neural Networks (EANN '99) Warsaw, Poland 13-15 September 1999 First Call for Papers The conference is a forum for presenting the latest results on neural network applications in technical fields. The applications may be in any engineering or technical field, including but not limited to systems engineering, mechanical engineering, robotics, process engineering, metallurgy, pulp and paper technology, aeronautical engineering, computer science, machine vision, chemistry, chemical engineering, physics, electrical engineering, electronics, civil engineering, geophysical sciences, biomedical systems, and environmental engineering. Summaries of two pages (about 1000 words) should be sent by e-mail. Various sessions are being organised, and from this year responsibility for some of the sessions has been divided among session coordinators. Two-page abstracts related to the following areas may be sent directly to the person(s) coordinating the session:

Control systems (E. Tulunay, etulunay at ed.eee.metu.edu.tr and A. Ruano, aruano at ualg.pt)
Vision/Image processing (S. Draghici, sod at cs.wayne.edu)
Hybrid systems (J. Fernandez de Canete, canete at ctima.uma.es and D. Tsaptsinos, D.Tsaptsinos at kingston.ac.uk)
Process Engineering (R. Baratti, baratti at unica.it)
Biomedical systems (W. Duch, duch at phys.uni.torun.pl)
Metallurgy (P. Myllykoski, pirkka.myllykoski at hut.fi)
Meteorology (C. Schizas, schizas at ucy.ac.cy)

Papers which do not fit any of the above categories can be sent to the common address of the conference, which is eann99 at phys.uni.torun.pl.
Deadline: 15 February 1999 Format: plain text (for other formats please email eann99 at the previous address) Please mention two to four keywords. Submissions will be reviewed. For information on earlier EANN conferences see the www pages at http://www.abo.fi/~abulsari/EANN98.html. The local organisers are putting together a web page at http://www.phys.uni.torun.pl/eann99/. Notification of acceptance will be sent around 15 March. The papers will be expected by 15 April. All papers will be up to 6 pages in length. Authors are expected to register by 15 April.

Organising committee
R. Baratti, University of Cagliari, Italy
L. Bobrowski, Polish Academy of Science, Poland
A. Bulsari, Nonlinear Solutions Oy, Finland
W. Duch, Nicholas Copernicus University, Poland
J. Fernandez de Canete, University of Malaga, Spain
A. Ruano, University of Algarve, Portugal
D. Tsaptsinos, Kingston University, UK

National program committee
L. Bobrowski, IBIB Warsaw
A. Cichocki, RIKEN, Japan
W. Duch, Nicholas Copernicus University
T. Kaczorek, Warsaw Polytechnic
J. Korbicz, Technical University of Zielona Gora
L. Rutkowski, Czestochowa Polytechnic
R. Tadeusiewicz, Academy of Mining and Metallurgy, Krakow
Z. Waszczyszyn, Krakow Polytechnic

International program committee (to be confirmed, extended)
S. Cho, Pohang University of Science and Technology, Korea
T. Clarkson, King's College, UK
S. Draghici, Wayne State University, USA
G. Forsgrén, Stora Corporate Research, Sweden
I. Grabec, University of Ljubljana, Slovenia
A. Iwata, Nagoya Institute of Technology, Japan
C. Kuroda, Tokyo Institute of Technology, Japan
H. Liljenström, Royal Institute of Technology, Sweden
L. Ludwig, University of Tübingen, Germany
M. Mohammadian, Monash University, Australia
P. Myllykoski, Helsinki University of Technology, Finland
A. Owens, DuPont, USA
R. Parenti, Ansaldo Ricerche, Italy
F. Sandoval, University of Malaga, Spain
C. Schizas, University of Cyprus, Cyprus
E. Tulunay, Middle East Technical University, Turkey
S. Usui, Toyohashi University of Technology, Japan
P. Zufiria, Polytechnic University of Madrid, Spain

Electronic mail is not absolutely reliable, so if you have not heard from the conference secretariat after sending a summary, please contact the secretariat again. You should receive an abstract number within a couple of days of submission. Dr Dimitris Tsaptsinos Senior Lecturer School of Mathematics Kingston University Penrhyn Road Kingston-Upon-Thames Surrey KT1 2EE From martym at cs.utexas.edu Fri Oct 30 20:14:39 1998 From: martym at cs.utexas.edu (martym@cs.utexas.edu) Date: Fri, 30 Oct 1998 19:14:39 -0600 Subject: connectionist NLP software, papers, web demos Message-ID: <199810310114.TAA17159@fez.cs.utexas.edu> Dear Connectionists: The following software package for connectionist natural language processing and papers based on it are available from the UTCS Neural Networks Research Group website http://www.cs.utexas.edu/users/nn. Live demos of the software, including the systems described in the papers, can be run remotely from the research description page http://www.cs.utexas.edu/users/nn/pages/research/nlp.html. ======================================================================= Software: MIR: RAPID PROTOTYPING OF NEURAL NETWORKS FOR SENTENCE PROCESSING http://www.cs.utexas.edu/users/nn/pages/software/abstracts.html#mir The MIR software package has been designed for rapid prototyping of typical architectures used in NLP research (such as SRN, RAAM, and SOM) that depend heavily on one or more lexicons, and it can easily be extended to handle other architectures as well. It has been designed to be simple to install, modify, and use, while retaining as much flexibility as possible. To this end, the package is written in C using the Tcl/Tk libraries. The package includes a number of basic commands and widgets, so that the user can quickly set up, train, and test various architectures and visualize their dynamics.
High-level scripts with training/testing data are also provided as examples of running complete experiments with MIR. ======================================================================= Papers: ======================================================================= SARDSRN: A NEURAL NETWORK SHIFT-REDUCE PARSER Marshall R. Mayberry, III, and Risto Miikkulainen. Technical Report AI98-275, Department of Computer Sciences, The University of Texas at Austin, 1998 (12 pages). http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#mayberry.sardsrn.ps.Z Simple Recurrent Networks (SRNs) have been widely used in natural language tasks. SARDSRN extends the SRN by explicitly representing the input sequence in a SARDNET self-organizing map. The distributed SRN component leads to good generalization and robust cognitive properties, whereas the SARDNET map provides exact representations of the sentence constituents. This combination allows SARDSRN to learn to parse sentences with more complicated structure than can the SRN alone, and suggests that the approach could scale up to realistic natural language. ======================================================================= DISAMBIGUATION AND GRAMMAR AS EMERGENT SOFT CONSTRAINTS Risto Miikkulainen and Marshall R. Mayberry, III. In B. J. MacWhinney (editor), Emergentist Approaches to Language. Hillsdale, NJ: Erlbaum, in press (16 pages). http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#miikkulainen.emergent.ps.Z When reading a sentence such as "The diplomat threw the ball in the ballpark for the princess" our interpretation changes from a dance event to baseball and back to dance. Such on-line disambiguation happens automatically and appears to be based on dynamically combining the strengths of association between the keywords and the two word senses. Subsymbolic neural networks are very good at modeling such behavior. 
They learn word meanings as soft constraints on interpretation, and dynamically combine these constraints to form the most likely interpretation. On the other hand, it is very difficult to show how systematic language structures such as relative clauses could be processed in such a system. The network would only learn to associate them to specific contexts and would not be able to process new combinations of them. A closer look at understanding embedded clauses shows that humans are not very systematic in processing grammatical structures either. For example, "The girl who the boy who the girl who lived next door blamed hit cried" is very difficult to understand, whereas "The car that the man who the dog that had rabies bit drives is in the garage" is not. This difference emerges from the same semantic constraints that are at work in the disambiguation task. In this chapter we will show how the subsymbolic parser can be combined with high-level control that allows the system to process novel combinations of relative clauses systematically, while still being sensitive to the semantic constraints.
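The Simple Recurrent Network (SRN) that recurs throughout these abstracts is compact enough to sketch. The following is a minimal illustrative forward pass in Python/NumPy; it is not the MIR package's implementation, and the layer sizes, weight names, and toy "sentence" are invented for the example. The key idea is that the previous hidden state is fed back as a context layer, letting the network carry information across the words of a sentence.

```python
import numpy as np

# Minimal Elman-style Simple Recurrent Network (SRN) sketch.
# Illustrative only: all dimensions and names here are invented,
# not taken from the MIR package.

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 4                   # toy layer sizes
W_xh = rng.normal(0, 0.1, (n_hidden, n_in))       # input  -> hidden
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))   # context -> hidden (recurrence)
W_hy = rng.normal(0, 0.1, (n_out, n_hidden))      # hidden -> output

def srn_forward(sequence):
    """Run a sequence of input vectors through the SRN.

    The hidden state from the previous time step serves as the
    'context' layer, which is what lets the network accumulate
    information across the sentence.
    """
    h = np.zeros(n_hidden)                # context starts empty
    outputs = []
    for x in sequence:
        h = np.tanh(W_xh @ x + W_hh @ h)  # combine current input with context
        y = W_hy @ h                      # linear readout at each step
        outputs.append(y)
    return outputs

# Usage: a "sentence" of three one-hot word vectors.
sentence = [np.eye(n_in)[i] for i in (0, 2, 1)]
preds = srn_forward(sentence)             # one output vector per word
```

In a real parser the readout would be trained (e.g. by backpropagation through time) to predict the next word or a parse representation; SARDSRN additionally pairs this distributed component with a SARDNET self-organizing map to keep exact representations of the sentence constituents.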