From mkearns at research.att.com Wed Sep 1 10:41:36 1999 From: mkearns at research.att.com (Michael J. Kearns) Date: Wed, 1 Sep 1999 10:41:36 -0400 (EDT) Subject: Paper on TD convergence available Message-ID: <199909011441.KAA11369@radish.research.att.com> The following paper is now available at http://www.research.att.com/~mkearns/papers/tdlambda.ps.Z ``Bias-Variance'' Error Bounds for Temporal Difference Updates Michael Kearns Satinder Singh AT&T Labs We give the first rigorous upper bounds on the error of temporal difference ($\td$) algorithms for policy evaluation as a function of the amount of experience. These upper bounds prove exponentially fast convergence, with both the rate of convergence and the asymptote strongly dependent on the length of the backups $k$ or the parameter $\lambda$. Our bounds give formal verification to the long-standing intuition that $\td$ methods are subject to a ``bias-variance'' trade-off, and they lead to schedules for $k$ and $\lambda$ that are predicted to be better than any fixed values for these parameters. We give preliminary experimental confirmation of our theory for a version of the random walk problem. From oby at cs.tu-berlin.de Thu Sep 2 10:33:01 1999 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Thu, 2 Sep 1999 16:33:01 +0200 (MET DST) Subject: No subject Message-ID: <199909021433.QAA08015@pollux.cs.tu-berlin.de> Dear Connectionists, attached please find abstracts and preprint-locations of four papers about: 1. the application of Gold et al.'s (1995) matching method to the measurement of flow fields in fluid dynamics 2. Bayesian transduction 3. ICA and optical recording of brain activity 4. the role of cortical competition in visual cortical information processing Cheers Klaus =============================================================================== A new particle tracking algorithm based on deterministic annealing and alternative distance measures M. Stellmacher and K. 
Obermayer CS Department, Technical University of Berlin, Germany We describe a new particle tracking algorithm for the interrogation of double frame single exposure data which is obtained with particle image velocimetry. The new procedure is based on an algorithm which has recently been proposed by Gold et al. (1995) for solving point matching problems in statistical pattern recognition. For a given interrogation window, the algorithm simultaneously extracts: (1) the correct correspondences between particles in both frames and (2) an estimate of the local flow-field parameters. In contrast to previous methods, the algorithm determines not only the local velocity but also other local components of the flow field, for example rotation and shear. This makes the new interrogation method superior to standard methods, in particular in regions with high velocity gradients (e.g. vortices or shear flows). We perform benchmarks with three standard particle image velocimetry (PIV) and particle tracking velocimetry (PTV) methods: cross-correlation, nearest neighbour search, and image relaxation. We show that the new algorithm requires fewer particles per interrogation window than cross-correlation and allows for much higher particle densities than the other PTV methods. Consequently, one may obtain the velocity field at high spatial resolution even in regions of very fast flows. Finally, we find that the new algorithm is more robust against out-of-plane noise than previously proposed methods. http://ni.cs.tu-berlin.de/publications/#journals to appear in: Experiments in Fluids ------------------------------------------------------------------------------- Bayesian Transduction T. Graepel, R. Herbrich, and K. Obermayer CS Department, Technical University of Berlin, Germany Transduction is an inference principle that takes a training sample and aims at estimating the values of a function at given points contained in the so-called working sample.
Hence, transduction is a less ambitious task than induction, which aims at inferring a functional dependency on the whole of input space. As a consequence, however, transduction provides a confidence measure on single predictions rather than classifiers, a feature particularly important for risk-sensitive applications. We consider the case of binary classification by linear discriminant functions (perceptrons) in kernel space. From the transductive point of view, the infinite number of perceptrons is boiled down to a finite number of equivalence classes on the working sample, each of which corresponds to a polyhedron in parameter space. In the Bayesian spirit, the posterior probability of a labelling of the working sample is determined as the ratio between the volume of the corresponding polyhedron and the volume of version space. The maximum a posteriori scheme then recommends choosing the labelling of maximum volume. We suggest sampling version space by an ergodic billiard in kernel space. Experimental results on real world data indicate that Bayesian Transduction compares favourably to the well-known Support Vector Machine, in particular if the posterior probability of labellings is used as a confidence measure to exclude test points of low confidence. http://ni.cs.tu-berlin.de/publications/#conference to be presented at NIPS 1999 ------------------------------------------------------------------------------- Application of blind separation of sources to optical recording of brain activity H. Schöner^1, M. Stetter^1, I. Schießl^1, J. Mayhew^2, J. Lund^3, N. McLoughlin^3, and K. Obermayer^1 1: CS Department, Technical University of Berlin, Germany 2: AIVRU, University of Sheffield, UK 3: Institute of Ophthalmology, UCL, UK In the analysis of data recorded by optical imaging from intrinsic signals (measurement of changes of light reflectance from cortical tissue) the removal of noise and artifacts such as blood vessel patterns is a serious problem.
Often bandpass filtering is used, but the underlying assumption that a spatial frequency exists which separates the mapping component from other components (especially the global signal) is questionable. Here we propose alternative ways of processing optical imaging data, using blind source separation techniques based on the spatial decorrelation of the data. We first perform benchmarks on artificial data in order to select the processing scheme that is most robust with respect to sensor noise. We then apply it to recordings of optical imaging experiments from macaque primary visual cortex. We show that our BSS technique is able to extract ocular dominance and orientation preference maps from single-condition stacks for data where standard post-processing procedures fail. Artifacts, especially blood vessel patterns, can often be completely removed from the maps. In summary, our method for blind source separation using extended spatial decorrelation is a superior technique for the analysis of optical recording data. http://ni.cs.tu-berlin.de/publications/#conference to be presented at NIPS 1999 ------------------------------------------------------------------------------- Recurrent cortical competition: Strengthen or weaken? P. Adorján, L. Schwabe, C. Piepenbrock, and K. Obermayer CS Department, Technical University of Berlin, Germany We investigate the short term dynamics of recurrent competition and neural activity in the primary visual cortex in terms of information processing and in the context of orientation selectivity. We propose that after stimulus onset, the strength of the recurrent excitation decreases due to fast synaptic depression. As a consequence, the network is shifted from an initially highly nonlinear to a more linear operating regime. Sharp orientation tuning is established in the first highly competitive phase.
In the second and less competitive phase, precise signaling of multiple orientations and long-range modulation, e.g., by intra- and inter-areal connections, becomes possible (surround effects). Thus the network first extracts the salient features from the stimulus, and then starts to process the details. We show that this signal processing strategy is optimal if the neurons have limited bandwidth and their objective is to transmit the maximum amount of information in any time interval beginning with the stimulus onset. http://ni.cs.tu-berlin.de/publications/#conference to be presented at NIPS 1999 ================================================================================ Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ From Annette_Burton at Brown.edu Thu Sep 2 16:00:47 1999 From: Annette_Burton at Brown.edu (Annette Burton) Date: Thu, 2 Sep 1999 16:00:47 -0400 Subject: No subject Message-ID: A non-text attachment was scrubbed... Name: not available Type: multipart/alternative Size: 1891 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/7c07ccb7/attachment.bin From wiskott at itb.biologie.hu-berlin.de Tue Sep 7 05:42:57 1999 From: wiskott at itb.biologie.hu-berlin.de (Laurenz Wiskott) Date: Tue, 7 Sep 1999 11:42:57 +0200 Subject: Bibliographies Message-ID: <199909070942.LAA00972@monod.biologie.hu-berlin.de> Dear all, I have compiled several bibliographies on computational models and algorithms related to vision and neural networks, some of which might be of interest to you. They also contain many links to online documents and authors' homepages. You can access the bibliographies via my homepage http://itb.biologie.hu-berlin.de/~wiskott/homepage.html (or http://www.cnl.salk.edu/~wiskott/homepage.html).
Any kind of (constructive) feedback is welcome. Best regards, Laurenz Wiskott.

BIBLIOGRAPHIES
The approximate number of references and the support level are given in brackets.

Invariances in Neural Systems (175, mixed)
  Learning Invariances (51, medium - 1998)
Cortical and Artificial Neural Maps (135, mixed)
  Receptive Field Development (4, low)
  Cortical Map Analysis (8, low)
  Cortical Map Formation (106, high - 1999)
  Artificial Neural Maps (15, low)
Face Processing (196, mixed)
  Facial Feature Finding (30, low)
  Face Coding and Animation (59, low)
  Face Analysis (34, low)
  Face Recognition (95, medium - 1999)
  Dynamic Link Matching (13, medium - 1999)
Visual Motion Processing (615, mixed)
  Depth from Stereo (71, low)
  Optical Flow Estimation (248, medium - 1998)
  Image Motion Analysis (262, low)
  Segmentation from Motion (106, medium - 1998)
  Visual Tracking (48, low)
  Video Coding (71, low)

-- Laurenz Wiskott, Innovationskolleg Theoretische Biologie, Berlin http://itb.biologie.hu-berlin.de/~wiskott/ wiskott at itb.biologie.hu-berlin.de From fritz at neuro.informatik.uni-ulm.de Wed Sep 8 11:46:31 1999 From: fritz at neuro.informatik.uni-ulm.de (Fritz Sommer) Date: Wed, 8 Sep 99 17:46:31 +0200 Subject: Job opening in fMRI analysis/modeling Message-ID: <9909081546.AA13802@neuro.informatik.uni-ulm.de> Research Position (cognitive/computational neuroscience) A position (beginning Nov 1999, 2 years, 1 year extension possible) is available at the University of Ulm in an interdisciplinary research project on analysis and modeling of functional magnetic resonance data. In this joint project of the departments of Psychiatry, Radiology and Neural Information Processing a method of detection and interpretation of functional/effective connectivity in fMRI data will be developed and will be applied to working memory tasks. Candidates should have a background in statistical methods, functional MRI analysis or computational neuroscience.
A recent master's degree (or equivalent) in computer science, physics, mathematics, or a closely related area is required. The research can be conducted as part of a PhD in Computer Science. Experience in programming in C in a Unix environment is necessary, experience with MATLAB and SPM is helpful. Salary according to BAT IIa. The University of Ulm is an equal opportunity employer and emphatically encourages female scientists to apply. Employment will be effected through the "Zentrale Universitaetsverwaltung" of the University of Ulm. Please send CV, letter of motivation and addresses of three referees to: Prof. Dr. Dr. M. Spitzer, Department of Psychiatry III, University of Ulm, Leimgrubenweg 12, 89075 Ulm, Germany or e-mail to manfred.spitzer at medizin.uni-ulm.de. For more detailed information on the research project please contact Dr. F. T. Sommer, email: fritz at neuro.informatik.uni-ulm.de From cindy at cns.bu.edu Wed Sep 8 10:10:45 1999 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 8 Sep 1999 10:10:45 -0400 Subject: Neural Networks 12(6) Message-ID: <199909081410.KAA07821@retina.bu.edu> NEURAL NETWORKS 12(6) Contents - Volume 12, Number 6 - 1999

NEURAL NETWORKS LETTERS:
Improving support vector machine classifiers by modifying kernel functions
  S. Amari and S. Wu

ARTICLES:

*** Neuroscience and Neuropsychology ***
Self-organization of shift-invariant receptive fields
  Kunihiko Fukushima
Faithful representations with topographic maps
  M.M. van Hulle
A learning algorithm for oscillatory cellular neural networks
  C.Y. Ho and H. Kurokawa

*** Mathematical and Computational Analysis ***
Properties of learning of a Fuzzy ART variant
  M. Georgiopoulos, I. Dagher, G.L. Heileman, and G. Bebis
Morphological bidirectional associative memories
  G.X. Ritter, J.L. Diaz-de-Leon, and P. Sussner
The basins of attraction of a new Hopfield learning rule
  A.J. Storkey and R.
Valabregue

*** Technology and Applications ***
Bayesian neural networks for classification: How useful is the evidence framework?
  W.D. Penny and S.J. Roberts
Conformal self-organization for continuity on a feature map
  C.Y. Liou and W.-P. Tai
Design of trellis coded vector quantizers using Kohonen maps
  Chi-Sing Leung and Lai-Wan Chan
An information theoretic approach for combining neural network process models
  D.V. Sridhar, E.B. Bartlett, and R.C. Seagrave
Inferential estimation of polymer quality using bootstrap aggregated neural networks
  J. Zhang

______________________________ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment.
------------------------------------------------------------------------------
Membership Type           INNS            ENNS            JNNS
------------------------------------------------------------------------------
Membership with           $80             600 SEK         Y 15,000 [including
Neural Networks                                           2,000 entrance fee]
  (student)               $55             460 SEK         Y 13,000 [including
                                                          2,000 entrance fee]
------------------------------------------------------------------------------
Membership without        $30             200 SEK         not available to
Neural Networks                                           non-students (subscribe
                                                          through another society)
  (student)                                               Y 5,000 [including
                                                          2,000 entrance fee]
------------------------------------------------------------------------------
Institutional rates       $1132           2230 NLG        Y 149,524
------------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.ida.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ************************* From hadley at cs.sfu.ca Wed Sep 8 19:23:56 1999 From: hadley at cs.sfu.ca (Bob Hadley) Date: Wed, 8 Sep 1999 16:23:56 -0700 (PDT) Subject: Computational Power and Limits of ANNs Message-ID: <199909082323.QAA00295@css.css.sfu.ca> URL: www.cs.sfu.ca/~hadley/online.html ~~~~~~~~~ Paper Available ~~~~~~~~ COGNITION AND THE COMPUTATIONAL POWER OF CONNECTIONIST NETWORKS by Robert F. Hadley School of Computing Science and Cognitive Science Program Simon Fraser University Burnaby, B.C., V5A 1S6 Canada hadley at cs.sfu.ca ABSTRACT This paper examines certain claims of ``cognitive significance'' which (wisely or not) have been based upon the theoretical powers of three distinct classes of connectionist networks, namely, the ``universal function approximators'', recurrent finite-state simulation networks, and Turing equivalent networks. Each class will be considered with respect to its potential in the realm of cognitive modeling. Regarding the first class, I argue that, contrary to the claims of some influential connectionists, feed-forward networks do NOT possess the theoretical capacity to approximate all functions of interest to cognitive scientists. For example, they cannot approximate many important recursive (halting) functions which map symbolic strings onto other symbolic strings. By contrast, I argue that a certain class of recurrent networks (i.e., those which closely approximate deterministic finite automata, DFA) shows considerably greater promise in some domains.
However, from a cognitive standpoint, difficulties arise when we consider how the relevant recurrent networks could acquire the weight vectors needed to support DFA simulations. These difficulties are severe in the realm of central high-level cognitive functions. In addition, the class of Turing equivalent networks is here examined. It is argued that the relevance of such networks to cognitive modeling is seriously undermined by their reliance on infinite precision in crucial weights and/or node activations. I also examine what advantages these networks might conceivably possess over and above classical symbolic algorithms. For, from a cognitive standpoint, the Turing equivalent networks present difficulties very similar to certain classical algorithms; they appear highly contrived, their structure is fragile, and they exhibit little or no noise-tolerance. (21 Pages -- 1.5 spacing ) Available by web at: www.cs.sfu.ca/~hadley/online.html From barba at cvs.rochester.edu Thu Sep 9 13:42:19 1999 From: barba at cvs.rochester.edu (Barbara Arnold) Date: Thu, 9 Sep 1999 13:42:19 -0400 Subject: Faculty Position Open Message-ID: Assistant Professor in Visual Neuroscience. The University of Rochester has available a tenure-track position for a neuroscientist working in the visual system. The successful candidate will have a primary appointment in the Department of Brain and Cognitive Sciences (http://www.bcs.rochester.edu ) and will be a member of the Center for Visual Science (http://www.cvs.rochester.edu), a strong, university-wide community of 27 faculty engaged in vision research. Applicants should submit a curriculum vitae, a brief statement of research and teaching interests, reprints and three reference letters to: David R. Williams, Director Center for Visual Science, University of Rochester, Rochester, NY 14627-0270. Application review begins December 1, 1999. 
The University of Rochester is an equal opportunity employer. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Barbara N. Arnold Administrator email: barba at cvs.rochester.edu Center for Visual Science phone: 716 275 8659 University of Rochester fax: 716 271 3043 Meliora Hall 274 Rochester NY 14627-0270 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From mgeorg at SGraphicsWS1.mpe.ntu.edu.sg Wed Sep 15 16:00:52 1999 From: mgeorg at SGraphicsWS1.mpe.ntu.edu.sg (Georg Thimm) Date: Fri, 10 Sep 1999 11:12:52 -12848 Subject: Events on Artificial Intelligence (moved to a new location!) Message-ID: <199909100312.LAA09802@SGraphicsWS1.mpe.ntu.edu.sg> ----------------------------------------- WWW page for Announcements of Conferences, Workshops and Other Events on Artificial Intelligence ----------------------------------------- This WWW page allows you to look up and enter announcements for conferences, workshops, and other events concerned with neural networks, inductive learning, genetic algorithms, data mining, agents, applications of AI, pattern recognition, vision, and related fields. ------------------------------------------------------------------------- Search and lookup can be restricted to events with forthcoming deadlines! Digests for events entered in the last 2, 5, 10 or 30 days are available! ------------------------------------------------------------------------- The frequently updated events list currently contains more than 130 forthcoming events and can be accessed via the URL: http://www.drc.ntu.edu.sg/users/mgeorg/enter.epl The entries are ordered chronologically and presented in a format for fast and easy lookup of:
- date and place of the events,
- titles of the events,
- contact addresses (surface mail, email, ftp, and WWW address, as well as telephone or fax number), and
- deadlines for submissions, registration, etc.
Conference organizers are kindly asked to enter their conference into the database: http://www.drc.ntu.edu.sg/users/mgeorg/NN-events.epl . Parts of the list are published in the journal Neurocomputing by Elsevier Science B.V. Information on past conferences is also available. Kind Regards, Georg Thimm P.S. You are welcome to distribute this announcement to related mailing lists. From hiro at ladyday.kyoto-su.ac.jp Fri Sep 10 02:18:47 1999 From: hiro at ladyday.kyoto-su.ac.jp (hiro) Date: Fri, 10 Sep 1999 15:18:47 +0900 (JST) Subject: JPSTH paper Message-ID: <199909100618.PAA00576@ladyday.kyoto-su.ac.jp> The following paper has been accepted for publication in Neural Computation and is available from http://www.kyoto-su.ac.jp/~hiro/jpsth_rev3.pdf ----------- Model Dependence in Quantification of Spike Interdependence by Joint Peri-Stimulus Time Histogram Hiroyuki Ito and Satoshi Tsuji Department of Information and Communication Sciences, Faculty of Engineering, Kyoto Sangyo University, Kita-ku, Kyoto 603-8555, Japan and CREST, Japan Science and Technology. ABSTRACT Multineuronal recordings have enabled us to examine context-dependent changes in the relationship between the activities of multiple cells. The Joint Peri-Stimulus Time Histogram (JPSTH) is a much-used method for investigating the dynamics of the interdependence of spike events between pairs of cells. Its results are often taken as an estimate of interaction strength between cells, independent of modulations in the cells' firing rates. We evaluate the adequacy of this estimate by examining the mathematical structure of how the JPSTH quantifies an interaction strength after excluding the contribution of firing rates. We introduce a simple probabilistic model of interacting point processes to generate simulated spike data, and show that the normalized JPSTH incorrectly infers the temporal structure of variations in the interaction parameter strength.
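[Editor's note: for readers who want to experiment with the quantity under discussion, the raw JPSTH and its rate-normalized form can be sketched as follows. This is an illustrative reimplementation in Python/NumPy under the textbook definitions (trial-averaged joint counts, PSTH cross-product predictor, bin-wise standard deviations), not the authors' code, and the surrogate data at the end are hypothetical.]

```python
import numpy as np

def jpsth(x, y):
    """Raw and normalized JPSTH for a pair of simultaneously recorded cells.

    x, y: binary spike arrays of shape (n_trials, n_bins), one per cell.
    Returns the raw JPSTH, the PSTH cross-product predictor, and the
    normalized JPSTH, in which the predictor is subtracted and each entry
    is divided by the product of bin-wise standard deviations, giving a
    bin-by-bin correlation coefficient.
    """
    n_trials = x.shape[0]
    raw = x.T @ y / n_trials                  # raw[i, j] = mean over trials of x[:, i] * y[:, j]
    pred = np.outer(x.mean(0), y.mean(0))     # expected value if the cells were independent
    denom = np.outer(x.std(0), y.std(0))
    norm = np.divide(raw - pred, denom,
                     out=np.zeros_like(raw), where=denom > 0)
    return raw, pred, norm

# Surrogate data: each cell fires from independent background activity
# plus a shared driver, so coincidences concentrate on the diagonal.
rng = np.random.default_rng(0)
common = rng.random((200, 50)) < 0.15
x = ((rng.random((200, 50)) < 0.10) | common).astype(float)
y = ((rng.random((200, 50)) < 0.10) | common).astype(float)
raw, pred, norm = jpsth(x, y)
```

On such surrogate data the diagonal of `norm` stands out clearly against the off-diagonal background. The abstract's argument is that this subtract-and-divide normalization is only one model-dependent choice among several, not a rate-independent measure of interaction strength.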
This occurs because, in our model, the correct normalization of firing rate contributions is different from that used in Aertsen et al.'s ``effective connectivity'' model. This demonstrates that firing rate modulations cannot be corrected for in a model-independent manner, and therefore the effective connectivity does not represent a universal characteristic that is independent of modulation of the firing rates. Aertsen et al.'s effective connectivity may still be used in the analysis of experimental data, provided we are aware that this is simply one of many ways of describing the structure of interdependence. We also discuss some measure-independent characteristics of the structure of interdependence. ------------ Regards. Hiroyuki Ito Dept. of Information & Communication Sci. Faculty of Engineering Kyoto Sangyo University JAPAN e-mail: hiro at ics.kyoto-su.ac.jp From morten at compute.it.siu.edu Fri Sep 10 14:18:36 1999 From: morten at compute.it.siu.edu (Dr. Morten H. Christiansen) Date: Fri, 10 Sep 1999 13:18:36 -0500 (CDT) Subject: Paper announcements Message-ID: The following two papers may be of interest to the readers of this list. Both papers involve connectionist modeling of psycholinguistic data. Christiansen, M.H. & Chater, N. (1999). Toward a connectionist model of recursion in human linguistic performance. Cognitive Science, 23, 157-205. Abstract Naturally occurring speech contains only a limited amount of complex recursive structure, and this is reflected in the empirically documented difficulties that people experience when processing such structures. We present a connectionist model of human performance in processing recursive language structures. The model is trained on simple artificial languages.
We find that the qualitative performance profile of the model matches human behavior, both on the relative difficulty of center-embedding and cross-dependency, and between the processing of these complex recursive structures and right-branching recursive constructions. We analyze how these differences in performance are reflected in the internal representations of the model by performing discriminant analyses on these representations both before and after training. Furthermore, we show how a network trained to process recursive structures can also generate such structures in a probabilistic fashion. This work suggests a novel explanation of people's limited recursive performance, without assuming the existence of a mentally represented competence grammar allowing unbounded recursion. The paper was published in the current issue of Cognitive Science. A preprint version can be downloaded from: http://www-rcf.usc.edu/~mortenc/nn-rec.html ---------------------------------------------------------------------- Christiansen, M.H. & Curtin, S.L. (1999). The power of statistical learning: No need for algebraic rules. In The Proceedings of the 21st Annual Conference of the Cognitive Science Society (pp. 114-119). Mahwah, NJ: Lawrence Erlbaum Associates. Abstract Traditionally, it has been assumed that rules are necessary to explain language acquisition. Recently, Marcus, Vijayan, Rao & Vishton (1999) have provided behavioral evidence which they claim can only be explained by invoking algebraic rules. In the first part of this paper, we show that, contrary to these claims, an existing simple recurrent network model of word segmentation can fit the relevant data without invoking any rules. Importantly, the model closely replicates the experimental conditions, and no changes are made to the model to accommodate the data.
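[Editor's note: as background for both abstracts, the simple recurrent network (SRN) family they refer to can be sketched in a few lines of Python. This is a generic Elman-style network with one-step truncated gradients, trained to predict the next symbol of a toy sequence; the vocabulary, layer sizes, and learning rate are illustrative assumptions, not details of the models described above.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy corpus: the repeating string "abb". After 'a' the next symbol is
# always 'b', but after 'b' the successor depends on the symbol before it,
# so the copied-back context units must carry information across steps.
seq = "abb" * 200
vocab = sorted(set(seq))                      # ['a', 'b']
onehot = np.eye(len(vocab))
X = np.array([onehot[vocab.index(c)] for c in seq[:-1]])
T = np.array([vocab.index(c) for c in seq[1:]])

n_in, n_hid, n_out = len(vocab), 8, len(vocab)
Wx = rng.normal(0.0, 0.5, (n_hid, n_in))      # input  -> hidden
Wh = rng.normal(0.0, 0.5, (n_hid, n_hid))     # context -> hidden
Wy = rng.normal(0.0, 0.5, (n_out, n_hid))     # hidden -> output
bh, by = np.zeros(n_hid), np.zeros(n_out)
lr = 0.05

def epoch(update=True):
    """One pass over the sequence; returns the mean cross-entropy loss."""
    global Wx, Wh, Wy, bh, by
    h = np.zeros(n_hid)
    losses = []
    for x, t in zip(X, T):
        h_prev = h
        h = np.tanh(Wx @ x + Wh @ h_prev + bh)
        z = Wy @ h + by
        p = np.exp(z - z.max()); p /= p.sum()  # softmax over next symbol
        losses.append(-np.log(p[t]))
        if update:                             # Elman-style truncation: the
            dz = p.copy(); dz[t] -= 1.0        # gradient stops at the copied
            dh = (Wy.T @ dz) * (1.0 - h * h)   # context h_prev
            Wy -= lr * np.outer(dz, h); by -= lr * dz
            Wx -= lr * np.outer(dh, x); bh -= lr * dh
            Wh -= lr * np.outer(dh, h_prev)
    return float(np.mean(losses))
```

Calling `epoch()` repeatedly drives the prediction loss well below its starting value as the network absorbs the sequential statistics. Christiansen & Curtin's word-segmentation model is, of course, far richer than this sketch.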
The second part provides a corpus analysis inspired by this model, demonstrating that lexical stress changes the basic representational landscape over which statistical learning takes place. This change makes the task of word segmentation easier for statistical learning models, and further obviates the need for lexical stress rules to explain the bias towards trochaic stress patterns in English. Together the connectionist simulations and the corpus analysis show that statistical learning devices are sufficiently powerful to eliminate the need for rules in an important part of language acquisition. The paper was published in the most recent Cognitive Science Society proceedings. An HTML version of the paper can be viewed at: http://www.siu.edu/~psycho/faculty/morten/statlearn.html And a hardcopy can be downloaded from: http://www-rcf.usc.edu/~mortenc/no-rules.html Best regards, Morten Christiansen PS: Apologies if you receive two copies of this message. ---------------------------------------------------------------------- Morten H. 
Christiansen Assistant Professor Phone: +1 (618) 453-3547 Department of Psychology Fax: +1 (618) 453-3563 Southern Illinois University Email: morten at siu.edu Carbondale, IL 62901-6502 Office: Life Sciences II, Room 271A URL: http://www.siu.edu/~psycho/faculty/mhc.html ---------------------------------------------------------------------- From chella at unipa.it Fri Sep 10 13:33:30 1999 From: chella at unipa.it (Antonio Chella) Date: Fri, 10 Sep 1999 19:33:30 +0200 Subject: INTERNATIONAL SCHOOL ON NEURAL NETS <> Message-ID: <37D940E8.CA4E081A@unipa.it> [We apologize for multiple copies] INTERNATIONAL SCHOOL ON NEURAL NETS <> 4th Course: Subsymbolic Computation in Artificial Intelligence ERICE-SICILY: October 24-31, 1999

Motivations

Autonomous intelligent agents that perform complex real world tasks must be able to build and process rich internal representations that allow them to effectively draw inferences, make decisions, and, in general, perform reasoning processes concerning their own tasks. Within the computational framework of artificial intelligence (AI) this problem has been faced in different ways. According to the classical, symbolic approach, internal representations are conceived in terms of linguistic structures, as expressions of a "language of thought". Other traditions have developed approaches that are less linguistically oriented and more biologically and anatomically motivated. This is the case for neural networks and for self-organizing and evolutionary algorithms. Empirical results concerning natural intelligent systems suggest that such approaches are not fully incompatible, and that different kinds of representation may interact. Similarly, it can be argued that the design of artificial intelligent systems can take advantage of different kinds of interacting representations that are suited for different tasks.
In this perspective, theoretical frameworks and methodological techniques are needed that allow different kinds of representation to be employed together in a principled way. In particular, autonomous agents need to find the meaning for the symbols they use within their internal processes and in the interaction with the external world, thus overcoming the well-known symbol grounding problem. An information processing architecture for autonomous intelligent agents should exhibit processes that act on suitable intermediate levels, which are intermediary among sensory data, the symbolic level, and actions. These processes could be defined in terms of subsymbolic computation paradigms, such as neural networks, self-organizing, and evolutionary algorithms.

DIRECTOR OF THE COURSE: Salvatore Gaglio
DIRECTOR OF THE SCHOOL: M.I. Jordan and M. Marinaro
DIRECTOR OF THE CENTRE: A. Zichichi
SCIENTIFIC SECRETARIAT: Edoardo Ardizzone, Antonio Chella, Marcello Frixione
WEB PAGE OF THE SCHOOL: http://www.cere.pa.cnr.it/ScuolaErice/

==================

PROGRAM

FOUNDATIONS
Introduction to Artificial Intelligence
  L. CARLUCCI AIELLO, University of Rome "La Sapienza", Rome, Italy
Neural modelling of higher order cognitive processes
  J. TAYLOR, King's College, London, UK
Connectionist Models for Data Structures
  M. GORI, University of Siena, Siena, Italy
Neural Systems Engineering
  I. ALEKSANDER, Imperial College, London, UK

ASEIT (Advanced School on Electronics and Information Technology) OPEN INTERNATIONAL WORKSHOP ON SUBSYMBOLIC TECHNIQUES AND ALGORITHMS
  P. GARDENFORS, Lund University, Sweden
  I. ALEKSANDER, Imperial College, London, UK
  T. KOHONEN, Helsinki University of Technology, Finland
  J. TAYLOR, King's College, London, UK
  R. ARKIN, Georgia Institute of Technology, USA

REPRESENTATION
Conceptual Spaces
  P. GARDENFORS, Lund University, Sweden
Topological Self Organizing Maps
  T. KOHONEN, Helsinki University of Technology, Finland
Symbolic Representation
  L.
CARLUCCI AIELLO, University of Rome "La Sapienza", Rome, Italy VISUAL PERCEPTION Evolutionary Processes for Artificial Perception G. ADORNI, University of Parma, Italy S. CAGNONI, University of Parma, Italy Cognitive Architectures for Artificial Intelligence M. FRIXIONE, University of Salerno S. GAGLIO, University of Palermo Algorithms for computer vision V. DI GESU', University of Palermo, Italy ACTION Motion Maps P. MORASSO, University of Genoa, Italy The self-organisation of grounded languages on autonomous robots L. STEELS, Free University of Brussels, Belgium Reinforcement Learning in Autonomous Robots C. BALKENIUS, Lund University, Sweden Behaviour-Based Robotics R. ARKIN, Georgia Institute of Technology, USA ================== APPLICATIONS Interested candidates should send a letter to the Director of the Course: Professor Salvatore GAGLIO Dipartimento di Ingegneria Automatica e Informatica Universita' di Palermo Viale delle Scienze 90128 - PALERMO - ITALY E-mail: gaglio at unipa.it They should specify: 1.date and place of birth, together with present nationality; 2.affiliation; 3.address, e-mail address. Please enclose a letter of recommendation from the group leader or the Director of the Institute or from a senior scientist. PLEASE NOTE Participants must arrive in Erice on October 24, not later than 5:00 pm. IMPORTANT The total fee, which includes full board and lodging (arranged by the School), is EURO 1000 (about 1000 USD). Thanks to the generosity of the sponsoring institutions, partial or full support can be granted to some deserving students who need financial aid. Requests to this effect must be specified and justified in the letter of application. Closing date for application: September 20, 1999 A limited number of places is available. Admission to the Workshop will be decided in consultation with the Advisory Committee of the School composed of Professors S. Gaglio, M. Marinaro, and A. Zichichi. 
An area for some contributed poster presentations will be available. These will be selected on the basis of an abstract of two A4 pages to be sent to the Director of the Course before September 20, 1999.

From fet at socrates.berkeley.edu Fri Sep 10 17:18:05 1999
From: fet at socrates.berkeley.edu (Frederic Edouard Theunissen)
Date: Fri, 10 Sep 1999 14:18:05 -0700
Subject: Position available in quantitative Psychology
Message-ID: <007c01befbd1$f8de2fc0$e0f32080@hinault.Psych.Berkeley.EDU>

********************************************************************
UNIVERSITY OF CALIFORNIA AT BERKELEY: The Department of Psychology invites applications at any level for two tenured/tenure-track positions beginning July 1, 2000. We are interested in two areas: (1) quantitative psychology (including, but not limited to, multivariate analysis, measurement, mathematical modeling, and computer modeling), and (2) social/personality psychology. Applications for the position must be postmarked by October 1, 1999, and should include a curriculum vitae, a description of research interests, and selected reprints, sent to: Search Committee, Department of Psychology, 3210 Tolman Hall #1650, University of California, Berkeley, CA 94720-1650. Candidates should also arrange to have at least three letters of recommendation sent to the same address by the application date. Candidates are asked to specify the position for which they are applying, and to submit an application for each position should they wish to be considered for both. Applications postmarked after the deadline will not be considered. The University of California is an Equal Opportunity/Affirmative Action Employer.

From cweber at cs.tu-berlin.de Mon Sep 13 11:40:32 1999
From: cweber at cs.tu-berlin.de (Cornelius Weber)
Date: Mon, 13 Sep 1999 17:40:32 +0200 (MET DST)
Subject: Paper available
Message-ID:

The following ICANN'99 conference paper is now available on-line.
Orientation Selective Cells Emerge in a Sparsely Coding Boltzmann Machine

Abstract: In our contribution we investigate a sparsely coded Boltzmann machine as a model for the formation of orientation selective receptive fields in primary visual cortex. The model consists of two layers of neurons which are recurrently connected and which represent the lateral geniculate nucleus and primary visual cortex. Neurons have ternary activity values +1, -1, and 0, where the 0-state is degenerate, being assumed with higher prior probability. The probability of a (stochastic) activation vector on the net obeys the Boltzmann distribution, and maximum-likelihood learning leads to the standard Boltzmann learning rule. We apply a mean-field version of this model to natural image processing and find that neurons develop localized and oriented receptive fields.

http://www.cs.tu-berlin.de/~cweber/publications/99sparseBM.ps (6 pages, 180 KB)

From ken at phy.ucsf.EDU Tue Sep 14 04:18:14 1999
From: ken at phy.ucsf.EDU (Ken Miller)
Date: Tue, 14 Sep 1999 01:18:14 -0700 (PDT)
Subject: CSH/Stony Brook Fellowships: Interdisciplinary Research in Brain Theory
Message-ID: <14302.1222.260286.182929@coltrane.ucsf.edu>

The Cold Spring Harbor Laboratory and The State University of New York at Stony Brook: SICN Fellowships for Interdisciplinary Research in Neuroscience

The Swartz Initiative for Computational Neuroscience (SICN) announces a program to promote collaborative studies between researchers in computational neuroscience, mathematics, physics, engineering and computer sciences. The goal is to understand the algorithms and implementations that underlie brain functions. To this end, SICN intends to foster the growth of interdisciplinary research in brain theory. SUNY at Stony Brook has strong departments in neurobiology, mathematics, physical sciences and technology. CSHL has a strong program in cellular neurobiology and has made a growing commitment to theoretical neuroscience.
Scientists will be hired at the post-doctoral level to work in association with faculty or, if deemed appropriate, independently. The salary will be highly competitive, and those selected will be eligible for continuing support from SICN. Candidates should submit (to Jonathan Wallach at the address below) a one-page summary of their research interests and goals, a CV, and the names of three academic references. For full consideration, applications should be received by October 15th, 1999. For further information about these positions please contact: Jonathan Wallach, Director, The Swartz Foundation, 535 Life Sciences, SUNY at Stony Brook, Stony Brook, NY 11794-5230; email: wallach at swartzneuro.org; tel: 516 632 4179. Visit the Swartz Foundation web site at www.swartzneuro.org for complete information about this program.

From ken at phy.ucsf.EDU Tue Sep 14 04:34:39 1999
From: ken at phy.ucsf.EDU (Ken Miller)
Date: Tue, 14 Sep 1999 01:34:39 -0700 (PDT)
Subject: Paper available: Subregion Correspondence Model of Binocular Simple Cells
Message-ID: <14302.2207.469424.704224@coltrane.ucsf.edu>

The following paper is now available at
ftp://ftp.keck.ucsf.edu/pub/ken/dispar.ps.gz (compressed postscript)
ftp://ftp.keck.ucsf.edu/pub/ken/dispar.ps (postscript)
http://www.keck.ucsf.edu/~ken (click on 'Publications')
This is a preprint of an article that appeared as Journal of Neuroscience 19:7212-7229 (1999): http://www.jneurosci.org/cgi/content/abstract/19/16/7212

------------------------------

The Subregion Correspondence Model of Binocular Simple Cells
Ed Erwin and Kenneth D. Miller
Dept. of Physiology, UCSF

ABSTRACT: We explore the hypothesis that binocular simple cells in cat areas 17 and 18 show subregion correspondence, defined as follows: within the region of overlap of the two eyes' receptive fields, their ON subregions lie in corresponding locations, as do their OFF subregions.
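[One way to make the geometry of this definition explicit -- my own sketch and notation, not the authors': model each eye's receptive-field profile along the axis perpendicular to the preferred orientation as a Gabor-like function with preferred spatial frequency $f$, center $x_i$, and phase $\phi_i$.]

```latex
g_i(x) = \cos\bigl(2\pi f (x - x_i) + \phi_i\bigr), \qquad i \in \{L, R\}.
% Subregion correspondence: ON and OFF subregions of the two eyes
% coincide in the overlap region, i.e. g_L(x) = g_R(x) there, so
2\pi f (x - x_L) + \phi_L \equiv 2\pi f (x - x_R) + \phi_R \pmod{2\pi}
\;\Longrightarrow\;
\phi_L - \phi_R \equiv 2\pi f\,(x_L - x_R) \pmod{2\pi}.
```

[That is, interocular phase shift proportional to interocular position shift, with slope set by the preferred spatial frequency, up to sign and phase conventions.]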
This hypothesis is motivated by a developmental model (Erwin and Miller, 1998) that suggested that simple cells could develop binocularly matched preferred orientations and spatial frequencies by developing subregion correspondence. Binocular organization of simple cell receptive fields is commonly characterized by two quantities: interocular position shift, the distance in visual space between the center positions of the two eyes' receptive fields; and interocular phase shift, the difference in the spatial phases of those receptive fields, each measured relative to its center position. The subregion correspondence hypothesis implies that interocular position and phase shifts are linearly related. We compare this hypothesis with the null hypothesis, assumed by most previous models of binocular organization, that the two types of shift are uncorrelated. We demonstrate that the subregion correspondence and null hypotheses are equally consistent with previous measurements of binocular response properties of individual simple cells in the cat and other species, and with measurements of the distribution of interocular phase shifts vs. preferred orientations or vs. interocular position shifts. However, the observed tendency of binocular simple cells in the cat to have ``tuned excitatory'' disparity tuning curves with preferred disparities tightly clustered around zero (Fischer and Kruger, 1979; Ferster, 1981; LeVay and Voigt, 1988) follows naturally from the subregion correspondence hypothesis, but is inconsistent with the null hypothesis. We describe tests that could more conclusively differentiate between the hypotheses. The most straightforward test requires simultaneous determination of the receptive fields of groups of 3 or more binocular simple cells.

--------------------------------

Ken

Kenneth D. Miller telephone: (415) 476-8217 Dept. of Physiology fax: (415) 476-4929 UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444

From protopap at panteion.gr Tue Sep 14 14:36:20 1999
From: protopap at panteion.gr (Athanassios Protopapas)
Date: Tue, 14 Sep 1999 21:36:20 +0300 (EET DST)
Subject: Paper announcement: Conn Speech Perception
Message-ID:

Dear colleagues, I would like to bring to your attention my recent review paper titled "Connectionist Modeling of Speech Perception," published in Psychological Bulletin 125(4):410-436. You may find it interesting, and perhaps useful for a graduate-level course, as it attempts to bring together the connectionist and speech literatures, requiring no substantial prior understanding of either beyond a general psychology background. If you do not have access to a personal or library subscription to Psychological Bulletin, you may contact me for a photocopy of the article. The abstract is:

Connectionist models of perception and cognition, including the process of deducing meaningful messages from patterns of acoustic waves emitted by vocal tracts, are developed and refined as our understanding of brain function, psychological processes, and the properties of massively parallel architectures advances. In the present article, several important contributions from diverse points of view in the area of connectionist modeling of speech perception are presented and their relative merits discussed with respect to specific theoretical issues and empirical findings. TRACE, the Elman/Norris net, and Adaptive Resonance Theory constitute pivotal points exemplifying overall modeling success, progress in temporal representation, and plausible modeling of learning, respectively. Other modeling efforts are presented for the specific insights they offer, and the article concludes with a discussion of computational versus dynamic modeling of phonological processes. Your comments will also be greatly appreciated.
Thanassi Protopapas

--
Athanassios Protopapas, PhD Department of Educational Technology Phone: +30 1 680 0959 Institute for Language and Speech Processing Fax: +30 1 685 4270 Epidavrou & Artemidos 6, Marousi e-mail: protopap at ilsp.gr GR-151 25 ATHENS, Greece

From hiro at ladyday.kyoto-su.ac.jp Wed Sep 15 00:06:16 1999
From: hiro at ladyday.kyoto-su.ac.jp (hiro@ladyday.kyoto-su.ac.jp)
Date: Wed, 15 Sep 1999 13:06:16 +0900
Subject: jpsth paper (updated)
Message-ID: <199909150406.NAA08066@ladyday.kyoto-su.ac.jp>

Dear Connectionists: Sorry to disturb you again, but I received mail reporting font problems in reading my preprint. I have updated the file so that everyone can read it. I apologize for the inconvenience. Hiroyuki Ito, Faculty of Engineering, Kyoto Sangyo University, Kyoto. hiro at ics.kyoto-su.ac.jp

--- my previous message ----

The following paper has been accepted for publication in Neural Computation and is available from http://www.kyoto-su.ac.jp/~hiro/jpsth_rev3.pdf

Model Dependence in Quantification of Spike Interdependence by Joint Peri-Stimulus Time Histogram
Hiroyuki Ito and Satoshi Tsuji
Department of Information and Communication Sciences, Faculty of Engineering, Kyoto Sangyo University, Kita-ku, Kyoto 603-8555, Japan and CREST, Japan Science and Technology

From mclennan at cs.utk.edu Wed Sep 15 16:51:04 1999
From: mclennan at cs.utk.edu (Bruce MacLennan)
Date: Wed, 15 Sep 1999 16:51:04 -0400
Subject: CFP/field computation
Message-ID: <199909152051.QAA07609@maclennan.cs.utk.edu>

Dear Colleagues: As program co-chair of the 4th International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE (Atlantic City, February 27 - March 3, 2000), I am organizing a special session on FIELD COMPUTATION (continuum-limit neural computation). (The general conference Call for Papers is attached.) A number of groups are now working in this area, and the time is ripe to gather in one place and compare results.
If you have been working in this area, I hope you will consider submitting a paper to this session. Although the conference CFP lists Sep 1 as the deadline for receipt of summaries, we will continue to receive them for this special session through Sep 30. However, if you are interested in participating but cannot meet this deadline, please let me know and I'll see what we can arrange. Best wishes and thank you, Bruce MacLennan

Department of Computer Science The University of Tennessee Knoxville, TN 37996-1301 PHONE: (423)974-5067 FAX: (423)974-4404 EMAIL: maclennan at cs.utk.edu URL: http://www.cs.utk.edu/~mclennan [sic]

=====================================================================

Call for Papers
4th International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE
http://www.csci.csusb.edu/iccin
Trump Taj Mahal Casino and Resort, Atlantic City, NJ USA
February 27 -- March 3, 2000
Summary Submission Deadline: September 1, 1999

Conference Co-chairs: Subhash C. Kak, Louisiana State University; Jeffrey P. Sutton, Harvard University

This conference is part of the Fourth Joint Conference on Information Sciences. http://www.ee.duke.edu/JCIS/

***Added plenary speakers***: Marvin Minsky and Brian Josephson

Plenary Speakers include the following:
+------------------+-------------------+------------------+------------------+
|James Anderson    |Wolfgang Banzhaf   |B. Chandrasekaran |Lawrence J. Fogel |
+------------------+-------------------+------------------+------------------+
|Walter J. Freeman |David E. Goldberg  |Irwin Goodman     |Stephen Grossberg |
+------------------+-------------------+------------------+------------------+
|Thomas S. Huang   |Janusz Kacprzyk    |A. C. Kak         |Subhash C. Kak    |
+------------------+-------------------+------------------+------------------+
|John Mordeson     |Kumpati S. Narendra|Anil Nerode       |Huang T. Nguyen   |
+------------------+-------------------+------------------+------------------+
|Jeffrey P. Sutton |Ron Yager          |                  |                  |
+------------------+-------------------+------------------+------------------+

Areas for which papers are sought include:
o Artificial Life
o Artificially Intelligent NNs
o Associative Memory
o Cognitive Science
o Computational Intelligence
o DNA Computing
o Efficiency/Robustness Comparisons
o Evolutionary Computation for Neural Networks
o Feature Extraction & Pattern Recognition
o Implementations (Electronic, Optical, Biochips)
o Intelligent Control
o Learning and Memory
o Neural Network Architectures
o Neurocognition
o Neurodynamics
o Optimization
o Parallel Computer Applications
o Theory of Evolutionary Computation

Summary Submission Deadline: September 1, 1999
Notification of authors upon review: November 1, 1999
Deadline for invited sessions and exhibition proposals: December 1, 1999

Papers will be accepted based on summaries. A summary shall not exceed 4 pages of 10-point font, double-column, single-spaced text, with figures and tables included. For the Fourth ICCIN, send 3 copies of summaries to:

George M. Georgiou
Computer Science Department
California State University
San Bernardino, CA 92407-2397 U.S.A.
georgiou at csci.csusb.edu

From Paolo.Gaudiano at artificial-life.com Wed Sep 15 12:46:21 1999
From: Paolo.Gaudiano at artificial-life.com (Paolo Gaudiano)
Date: Wed, 15 Sep 1999 12:46:21 -0400
Subject: FINAL CALL FOR PARTICIPATION in BOSTON, October 3-6
Message-ID:

FINAL ANNOUNCEMENT: UI-CANCS'99 to take place in Boston on October 3-6. [You will likely receive multiple copies of this. That's because this is such a great conference at such a low price that we think everyone should know about it before it's too late :-). Sorry about the clutter.]

What: USA-Italy Conference on Applied Neural and Cognitive Sciences
When: October 3-6, 1999
Where: Boston University Sherman Union, 775 Commonwealth Ave, Boston.
Web: www.usa-italy.org

Come hear outstanding members of industry and academia discuss state-of-the-art research and applications in intelligent agents, robotics, smart sensors, artificial intelligence, biomedical engineering and other cutting-edge technologies. Thanks to the generous support of Artificial Life, Inc. and of the Italian Ministry of Foreign Affairs, we are able to offer extremely low registration rates---even including meals and a tour of local area labs (space limited). Please visit our web site to see the exciting line-up we have planned and for additional details. If you have any questions please send e-mail to .

From smagt at dlr.de Thu Sep 16 04:25:00 1999
From: smagt at dlr.de (Patrick van der Smagt)
Date: Thu, 16 Sep 1999 10:25:00 +0200 (MET DST)
Subject: 3 PhD job openings at DLR, Oberpfaffenhofen, Germany
Message-ID: <199909160825.KAA03547@ilz.robotic>

The Robotics neuro-group at the German Aerospace Center in Oberpfaffenhofen, Germany, has three Ph.D. position openings on the following subjects:

* Learning changing environments in large neural networks
* Multiresolution neural networks for fast and accurate learning
* Active vision using statistical inference neural networks

Visit http://www.robotic.dlr.de/LEARNING/jobs/ for more information. Condensed versions of these job descriptions follow:

----------------------------------------------------------------------

Learning changing environments in large neural networks

The goal of this research project is the exploration and development of learning methodologies (optimization) for large-scale neural networks. Secondly, the effect of incremental learning in such networks is to be investigated. Close cooperation with the vision and DLR Four-Finger-Hand groups in our department is important in the development and application of the methodologies. We are looking for a candidate to join our currently expanding neuro-group in the DLR Robotics Department in Oberpfaffenhofen, Germany.
The desired candidate has a strong background in mathematics as well as statistics, and knows her way around programming in C++ and Mathematica.

----------------------------------------------------------------------

Multiresolution neural networks for fast and accurate learning

The goal of this Ph.D. research is the further development of a multiresolution approximation method for the on-line learning of high-dimensional data, using MARS as well as the Nested Network as a starting point. An important issue is the applicability to high-dimensional problems. The theoretical and implementational aspects of this project are of equal importance; as a result, the algorithm should be applied to a real-time shape-from-shading task, to dynamics control in complex hand-eye coordination tasks, and to learning data resulting from grasping tasks using the DLR Four-Finger-Hand, which is developed and available in the DLR Robotics group. For pursuing research in this area we are looking for an outstanding, well-qualified candidate to join our currently expanding neuro-group in the DLR Robotics Department in Oberpfaffenhofen, Germany. The research project has a major computer science component; therefore, the applicant ideally comes from a CS background, while having a strong foothold in statistics. Knowledge of function approximation with neural networks is a plus. Prerequisites are familiarity with a UNIX environment and knowledge of C++.

----------------------------------------------------------------------

Active Vision using statistical inference neural networks

The goal of this Ph.D. research is to develop a methodology based on statistical inference neural networks, which can be used in active vision control.
The theoretical and implementational aspects are of equal importance; in the end, the result should be a system that can be used on the DLR Four-Finger-Hand in combination with the DLR Light-Weight robot, both of which are developed and available in the DLR Robotics Department. The resulting system should eventually cooperate with a grasping methodology which is currently in development. For pursuing research in this exciting field we are looking for an outstanding candidate to join our currently expanding neuro-group in the DLR Robotics Department in Oberpfaffenhofen, Germany. In-depth knowledge of and practice in applied mathematics and strong programming skills are mandatory. Experience is expected in at least one of the following areas:

* statistical modeling, statistical inference
* data mining
* artificial neural networks/machine learning
* image processing

----------------------------------------------------------------------

Please direct all application material (CV/resume, xeroxed diplomas, letters of reference, and, where available, reprints of articles) or further questions to:

Dr. Patrick van der Smagt
Institute of Robotics and Systems Dynamics
DLR Oberpfaffenhofen
D-82334 Wessling
Phone +49 8153 281152, Fax +49 8153 281134, Email smagt at dlr.de

Start of appointments: immediate
Salary: according to half BAT IIa
Each of the appointments is limited to a period of 3 years. Standard EU regulations apply.

--
Dr Patrick van der Smagt phone +49 8153 281152, fax -34 DLR/Institute of Robotics and System Dynamics smagt at dlr.de P.O.Box 1116, 82230 Wessling, Germany http://www.robotic.de/Smagt/

From aapo at james.hut.fi Thu Sep 16 10:17:52 1999
From: aapo at james.hut.fi (Aapo Hyvarinen)
Date: Thu, 16 Sep 1999 17:17:52 +0300 (EEST)
Subject: papers on ICA
Message-ID: <199909161417.RAA34929@james.hut.fi>

Dear Connectionists, the following papers on extensions of ICA can now be found on my web page.
- Aapo Hyvarinen http://www.cis.hut.fi/~aapo/ --------------------------------------------------------------------- A. Hyvarinen and P. Hoyer. Topographic Independent Component Analysis. http://www.cis.hut.fi/~aapo/ps/gz/TICA.ps.gz Abstract: In ordinary independent component analysis, the components are assumed to be completely independent, and they do not necessarily have any meaningful order relationships. In practice, however, the estimated ``independent'' components are often not at all independent. We propose that this residual dependence structure could be used to define a topographic order for the components. In particular, a distance between two components could be defined using their higher-order correlations, and this distance could be used to create a topographic representation. Thus we obtain a linear decomposition into approximately independent components, where the dependence of two components is approximated by the proximity of the components in the topographic representation. --------------------------------------------------------------------- A. Hyvarinen and P. Hoyer. Emergence of phase and shift invariant features by decomposition of natural images into independent feature subspaces. (to appear in Neural Computation) http://www.cis.hut.fi/~aapo/ps/gz/NC99_complex.ps.gz Olshausen and Field (1996) applied the principle of independence maximization by sparse coding to extract features from natural images. This leads to the emergence of oriented linear filters that have simultaneous localization in space and in frequency, thus resembling Gabor functions and simple cell receptive fields. In this paper, we show that the same principle of independence maximization can explain the emergence of phase and shift invariant features, similar to those found in complex cells. This new kind of emergence is obtained by maximizing the independence between norms of projections on linear subspaces (instead of the independence of simple linear filter outputs). 
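Concretely, the invariant feature described here is just the Euclidean norm of the projection of an image patch onto the span of one group of filters. A minimal sketch, with hypothetical function and variable names of my own (not the authors' code), assuming the filters within each group are orthonormal:

```python
import numpy as np

def subspace_feature_norms(filter_groups, x):
    """Norms of the projections of input x onto each filter subspace.

    filter_groups: list of (k, d) arrays; the k rows of each array are
    linear filters assumed here to be orthonormal, spanning one feature
    subspace. x: input patch flattened to a length-d vector.
    Returns one nonnegative, phase-invariant feature per subspace.
    """
    return np.array([np.linalg.norm(W @ x) for W in filter_groups])

# Toy example: two 2-filter subspaces inside a 4-dimensional input space.
groups = [np.eye(4)[:2], np.eye(4)[2:]]
x = np.array([3.0, 4.0, 0.0, 0.0])
feats = subspace_feature_norms(groups, x)  # -> [5.0, 0.0]
```

Rotating x within a subspace (e.g. shifting the phase of a quadrature pair of Gabor filters) changes the individual filter outputs but leaves the norm unchanged, which is the sense in which the feature is phase and shift invariant.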
The norms of the projections on such `independent feature subspaces' then indicate the values of invariant features.

----------------------------------------------------------------------

(Some other new papers on ICA can be found on my publication page http://www.cis.hut.fi/~aapo/pub.html as well.)

From zhaoping at gatsby.ucl.ac.uk Fri Sep 17 04:55:53 1999
From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li)
Date: Fri, 17 Sep 1999 09:55:53 +0100 (BST)
Subject: Paper available on a model of visual search
Message-ID:

Title: Contextual influences in V1 as a basis for pop out and asymmetry in visual search
Author: Zhaoping Li
Published in Proc Natl Acad Sci USA, Volume 96, 1999, Pages 10530-10535

Available at http://www.gatsby.ucl.ac.uk/~zhaoping/preattentivevision.html or at http://www.pnas.org/content/vol96/issue18/#PSYCHOLOGY-BS

Abstract: I use a model to show how simple, bottom-up, neural mechanisms in primary visual cortex can qualitatively explain the preattentive component of complex psychophysical phenomena of visual search for a target among distracters. Depending on the image features, the speed of search ranges from fast, when a target pops out or is instantaneously detectable, to very slow, and it can be asymmetric with respect to switches between the target and distracter objects. It has been unclear which neural mechanisms or even cortical areas control the ease of search, and no physiological correlate has been found for search asymmetry. My model suggests that contextual influences in V1 play a significant role.

From mdorigo at ulb.ac.be Fri Sep 17 03:21:31 1999
From: mdorigo at ulb.ac.be (Marco DORIGO)
Date: Fri, 17 Sep 1999 09:21:31 +0200
Subject: New book on Swarm Intelligence
In-Reply-To: <2712.937551357@skinner.boltz.cs.cmu.edu>
Message-ID:

Swarm Intelligence: From Natural to Artificial Systems. Bonabeau E., M. Dorigo & G. Theraulaz (1999). New York: Oxford University Press.
The book "Swarm Intelligence" provides a detailed look at models of social insect behavior and how to apply these models in the design of complex systems. The book shows how these models replace an emphasis on control, preprogramming, and centralization with designs featuring autonomy, emergence, and distributed functioning. These designs are proving flexible and robust, able to adapt quickly to changing environments and to continue functioning even when individual elements fail. In particular, these designs offer a novel approach to coping with the tremendous growth of complexity in software and information. The book draws on up-to-date research from biology, neuroscience, artificial intelligence, robotics, operations research, and computer graphics, and each chapter is organized around a particular biological example, which is then used to develop an algorithm, a multiagent system, or a group of robots.

Contents:
1. Introduction
2. Ant Foraging Behavior, Combinatorial Optimization, and Routing in Communication Networks
3. Division of Labor and Task Allocation
4. Cemetery Organization, Brood Sorting, Data Analysis, and Graph Partitioning
5. Self-Organization and Templates: Application to Data Analysis and Graph Partitioning
6. Nest Building and Self-Assembling
7. Cooperative Transport by Insects and Robots
8. Epilogue

Information on how to order the book is available at: http://www.oup-usa.org/

From allan at nagoya.riken.go.jp Fri Sep 17 11:55:17 1999
From: allan at nagoya.riken.go.jp (Allan Kardec Barros)
Date: Sat, 18 Sep 1999 00:55:17 +0900 (JST)
Subject: Matlab Package on ICA
Message-ID: <199909171555.AAA22401@mail.bmc.riken.go.jp>

A non-text attachment was scrubbed...
Name: not available Type: text Size: 687 bytes Desc: not available Url: https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/17cde9cd/attachment.ksh

From rsun at cecs.missouri.edu Fri Sep 17 10:16:37 1999
From: rsun at cecs.missouri.edu (Ron Sun)
Date: Fri, 17 Sep 1999 09:16:37 -0500
Subject: IJCNN'2000 Call for Papers and Participation
Message-ID: <199909171416.JAA23416@pc113.cecs.missouri.edu>

========================================================================

Call For Papers *** IJCNN-2000 ***
IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS
to be held in Grand Hotel di Como, Como, Italy -- July 24-27, 2000

Sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society, with the technical cooperation of the Japanese Neural Network Society, AEI (the Italian Association of Electrical and Electronic Engineers), SIREN (the Italian Association of Neural Networks), and AI*IA (the Italian Association for Artificial Intelligence).

Submission deadline is 15 DECEMBER 1999. Full papers in final form must be submitted (accepted papers will be published as submitted). Papers will be reviewed by senior researchers in the field. Acceptance/rejection will be emailed by 30 March 2000. Accepted papers will be published only if the registration form and payment for at least one of the authors are received by 30 April 2000 (see the complete call for papers for details). For the complete Call for Papers and other information (including information about Como, Italy), visit the conference web site at: http://www.ims.unico.it/2000ijcnn.html (The organizers may be contacted by email at ijcnn2000 at elet.polimi.it.)

Important NOTICE: In the year 2000, the International Conference on Artificial Neural Networks (ICANN), organized annually by ENNS, will not take place, because it has been incorporated into IJCNN'2000.
====================================================================== Publicity Chair for IJCNN'2000: Prof. Ron Sun http://www.cecs.missouri.edu/~rsun CECS Department phone: (573) 884-7662 University of Missouri-Columbia fax: (573) 882 8318 201 Engineering Building West Columbia, MO 65211-2060 email: rsun at cecs.missouri.edu From harnad at coglit.ecs.soton.ac.uk Sat Sep 18 15:05:50 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sat, 18 Sep 1999 20:05:50 +0100 (BST) Subject: PSYC Call for Book Reviewers: Neuropsychology of Lashley & Hebb Message-ID: PSYCOLOQUY CALL FOR BOOK REVIEWERS Below is the Precis of "The Neuropsychological Theories of Lashley and Hebb" by Jack Orbach (427 lines). This book has been selected for multiple review in PSYCOLOQUY. If you wish to submit a formal book review please write to psyc at pucc.princeton.edu indicating what expertise you would bring to bear on reviewing the book if you were selected to review it. (If you have never reviewed for PSYCOLOQUY or Behavioral & Brain Sciences before, it would be helpful if you could also append a copy of your CV to your inquiry.) If you are selected as one of the reviewers and do not have a copy of the book, you will be sent a copy of the book directly by the publisher (please let us know if you have a copy already). Reviews may also be submitted without invitation, but all reviews will be refereed. The author will reply to all accepted reviews. Full Psycoloquy book review instructions at: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psycoloquy/ Relevant excerpts: Psycoloquy reviews are of the book not the Precis. Length should be about 200 lines [c. 1800 words], with a short abstract (about 50 words), an indexable title, and reviewer's full name and institutional address, email and Home Page URL. All references that are electronically accessible should also have URLs. 
AUTHOR'S RATIONALE FOR SOLICITING COMMENTARY My rationale for seeking open peer commentary is primarily that the book says some things about both Lashley and Hebb that some peers might find controversial and startling if not downright outrageous. To get these views out in the open may be of pedagogical value not only to me but to the neuropsychological community at large. Obviously, I don't believe that my arguments are wrong or weak. But the feedback I get might conceivably persuade me to rethink the matter. psycoloquy.99.10.029.lashley-hebb.1.orbach Sat Sep 18 1999 ISSN 1055-0143 (16 paragraphs, 16 references, 427 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 1999 Jack Orbach Precis of: THE NEUROPSYCHOLOGICAL THEORIES OF LASHLEY AND HEBB Precis of Orbach on Lashley-Hebb [University Press of America, 1998 xiv, 395 pp. ISBN: 0-761-81165-6] Jack Orbach Department of Psychology Queens College Flushing, NY U.S.A. jorbach at worldnet.att.net ABSTRACT: Beginning in the 1920s, K. S. Lashley startled psychologists with his theories of the memory trace within the cerebral cortex. Using terms such as mass action, equipotentiality, and sensory/motor equivalence, Lashley presented evidence that the engram is widely distributed in the brain, and that unactivated synapses, like activated ones, seem to show evidence of learning. His research and nativistic theories made him world famous by 1929, when he was just 39. He spent his professional career searching for a mechanism for the reduplication of the engram. While his contemporaries tried to specify the locus of the engram in the brain, Lashley found it everywhere. He liked to quip that the problem is not to find where the trace is located, but where it is not. Lashley's student, D. O. Hebb, published his empiricistic theories in 1949, in "The Organization of Behavior," and the monograph created a sensation. 
Hebb used Lorente de No's reverberatory circuit to provide a mechanism to maintain activity in the cerebral cortex after the stimulus terminated, the so-called central autonomous process. This led him to the cell assembly, a complex reverberatory circuit that could be assembled by experience. Changes in resistance at the synapse with learning came to be called the Hebb synapse. That monograph was highly praised for the breadth of its treatment. The present book documents how Lashley anticipated Hebb's introduction of the reverberatory circuit by some 12 years. Lashley's Vanuxem Lectures of 1952 are printed for the first time, together with nine of his previously published theoretical papers. Lashley's and Hebb's theories are reviewed and reevaluated fifty years after publication of Hebb's monograph, and a systematic effort is made to compare and contrast the views of teacher and student. KEYWORDS: cell assembly, central autonomous process, engram, equipotentiality, Hebb, Hebbian learning, Lashley, localization, memory trace, nativism, reverberatory circuit, Vanuxem Lectures 1. Part 1 of the book opens with a summary of Lashley's last public lecture given at the University of Rochester in 1957, one year before his death and eight years after the publication of Hebb's monograph. In this lecture, Lashley was still consumed with the notion of irradiating waves of excitation in the cerebral cortex, a notion he developed in detail in 1942. In citing theories of stimulus equivalence, Lashley wrote 'That of Hebb is most in accord with conditioned reflex theory. He assumes that multiple paths are developed by learning. Such learning is ruled out by a mass of evidence for innate discriminations and equivalencies.' In this unpublished address, Lashley cited Hebb's empiricistic theory for the first and only time. He never cited the monograph itself in the literature. 2. 
An early chapter entitled 'Setting the Stage' offers another look at Lashley's early critique of the naive Watsonian connectionism of his day. Lashley's early efforts to revise and revitalize neuropsychological theory are reviewed. The problem, Lashley suggested in the 1920s, was the omission of the brain from the Watsonian S-R formula. And when a model of cortical function was finally introduced, using the analogy of the telephone switchboard, it was based on the idea of linear reflex activity in the spinal cord, as suggested by Dewey, leaving no room for psychological categories that require sustained activity in the brain such as thought, memory, emotion, motivation, selective attention and the like. And then along came Pavlov, who undercut all contemporary speculations of psychologists with his physiological theories of conditioned reflexes and brain function. It was at this point that Lashley burst upon the scene. 3. Hebb must have experienced an epiphany when he was introduced to the reverberatory circuit of Lorente de Nó. He realized that this anatomical curiosity provided him with a mechanism for the autonomous central process that he developed so masterfully in the 1949 monograph. Hebb's revelation involving the reverberatory circuit was especially important, for it gave neurological meaning to the earlier proposals of the central motive state of Morgan and the central excitatory mechanism of Beach, as well as the putative reduplicated memory trace of Lashley. However, Lashley had already appropriated the reverberatory circuit for neuropsychological theory in 1937, some 12 years before Hebb's monograph was published and some three years before its presentation by Hilgard and Marquis in their Conditioning and Learning of 1940. This is documented with excerpts from Lashley's papers published in 1937, 1938, 1941, 1942 and 1949. The latter two papers are republished in their entirety in this volume. 4. 
The next chapter deals with the learning theory that synaptic resistance is reduced by the passage of the nerve impulse. Lashley's 1924 assault on this theory is reviewed in detail. (This 1924 paper is also reprinted in this volume.) In Lashley's own words, 'Among the many unsubstantiated beliefs concerning the physiology of the learning process, none is more widely prevalent than the doctrine that the passage of the nerve impulse through the synapse somehow reduces synaptic resistance and leads to the fixation of a new habit . . . but no direct evidence for synaptic resistance has ever been obtained. The hypothesis is not based upon neurological data but is merely a restatement of the observed fact that increased efficiency follows repeated performance . . . Familiar learning curves are obviously an expression of these successive integrations and we have no knowledge of the conditions prevailing in formation of a new simple neural integration. (On the other hand,) the instantaneous character of simpler associations in man . . . suggests that . . . a single performance serves to fix the habit. Even if this were the case for every simple reintegration within the nervous system, we should still get the appearance of gradual improvement through practice because of the formation of many simple associations . . . The fact of gradual improvement in complex functions cannot therefore be taken as evidence for a gradual wearing down of synaptic resistance by repeated passage of the nerve impulse' (Lashley, 1924). The reemergence of this theory in Hebb's monograph as a neuropsychological postulate is documented and evaluated. In the fourth edition of Hebb's Textbook (1994), Donderi refers to the postulate as Hebb's rule. Today, it is frequently referred to as the Hebb synapse. 5. Next, 'Lashley's Memory Mechanisms', considers: i. 
Lashley's view that the memory trace is reduplicated in the cerebral cortex and the implications of that view on the interpretation of cerebral localization. In 1952, Lashley wrote 'I have never been able by any operation on the brain to destroy a specific memory' even when the memory is elicited by electrical stimulation of the part of the cerebral cortex that is subsequently removed. ii. Lashley's early introduction of the reverberatory circuit in neuropsychological theory is documented. It is important to note that Lashley never abandoned the principle of synaptic transmission in favor of a cortical field theory, as had been alleged by Hebb and others. This claim is fully documented. iii. Lashley assumed throughout his career that memory is a unitary function. He was of course aware of the distinction between long and short term memory but he never referred to the modern distinction between storage and retrieval. Nor did he ever consider associative and working memory as distinct forms of memory when he searched for the engram in the cerebral cortex. iv. Lashley's position on the continuity-discontinuity debate is reviewed as well as his championing the concept of instinct at a time when the concept was falling into disfavor in America. In 1950, Lashley championed the European ethologists' views of fixed action patterns though he himself preferred the term instinct. His article on instinct in the Encyclopaedia Britannica of 1956 is especially noteworthy. 6. The next chapter, entitled 'Issues of Priority, Opinion and Criticism', includes: i. a reconsideration of Lashley's obsession with theoretical criticism in his later years, as alleged by Hebb. ii. an interpretation of the meaning of Lashley's refusal of Hebb's offer to coauthor the 1949 volume with him. iii. the history of the reverberatory circuit in the psychological literature, and questions of priority as far as the integration of the reverberatory circuit into neuropsychological theory is concerned. iv. 
the fact that Lashley failed to acknowledge data and theory that were unfavorable to his views, during his search for the engram. This is documented. v. Lashley's opinion of Hebb's theories was never known because Lashley hardly ever spoke of them. But Lashley's 1947 review of Hebb's manuscript-in-progress is revealing in this regard. Revealing as well is Lashley's letter of congratulations to Hebb after publication of his 1949 monograph. Finally, the personal relationship of Lashley, the teacher, and Hebb, the student, is delineated. 7. The next chapter, titled 'Hebb's The Organization of Behavior 50 Years After Publication', offers a contemporary view of Hebb's enduring contributions to neuropsychological theory. Hebb bolstered his theories with the following facts: i. adults seem to be able to sustain brain injury with fewer permanent consequences than can infants and children; ii. learning in children is much more difficult compared to similar learning in adults. Hebb further argued that: iii. distinctions should be made between primitive unity, non-sensory figure and identity in perception; iv. stimulus equivalence and generalization are learned in early life; v. the ratio of association cortex to sensory cortex should be considered in phylogenetic development; vi. the evidence of post-tetanic potentiation supports the importance of the Hebb synapse in learning (this phenomenon was described after the 1949 monograph was published but it found its way into Hebb's later writings); vii. there is a distinction between intelligence A (innate potential) and intelligence B (achievement); viii. following Tolman, Hebb introduced a new way of thinking about neuropsychological problems in his 1960 presidential address to APA. That discourse is today named cognitive psychology; ix. later research on the stabilized retinal image supported cell assembly theory. 8. 
The next chapter, 'The Left and Right Cerebral Hemispheres', reviews the case of Alex, a nine year old boy whose left hemisphere was removed for the relief of intractable epileptic seizures. Though he never learned to speak before surgery, Alex began to show remarkable gains in speech and language and in cognitive skills in general. Alex's postoperative achievements challenge the widely held view, shared by Hebb, that early childhood is a particularly critical period for the acquisition of cognitive skills. It must be concluded that clearly articulated, well-structured, and appropriate language can be acquired for the first time as late as nine years of age with the right hemisphere alone. Hebb and Lashley did not live to see this case. My guess is that Hebb would have had great difficulty in explaining Alex's achievements, but Lashley would have chuckled and muttered in so many words, 'You see, not only do you have reduplication of the memory trace within a hemisphere but also between hemispheres.' 9. The next chapter is entitled, 'A Comparison of Lashley and Hebb on the Concepts of Attention and Stimulus Generalization'. On the concept of attention, Lashley took off from his observations of attempted solutions in rats during the learning of the maze. It was Spence's concession on this matter that persuaded Lashley that he had bested the neo-behaviorists on the continuity-discontinuity debate. In the 1942 paper (reprinted in this volume), Lashley argued that a pattern of excitation in the cortex in the form of a reverberatory circuit may form a relatively stable and permanent foundation, modifying the effects of later stimulation, as attention determines the selection of stimuli. These ideas precede Hebb's formulation of the central autonomous process by some seven years. And then in the Vanuxem Lectures of 1952, Lashley went well beyond Hebb when he introduced the idea of a priming or pre-setting of circuits based upon the spacing of the end-feet on the post-synaptic cell. 
Hebb was by far the more accomplished writer and so, with the publication of his monograph in 1949, he captured the attention of the neuropsychological community with ideas that did not differ substantially from Lashley's. 10. However, on the matter of stimulus generalization their positions were radically different. Lashley's position is nativistic: stimulus generalization, if it exists at all, is built into the organism. His conception derived from his critique of the neo-Pavlovian view of a gradient in stimulus similarity underlying stimulus generalization. His discontinuity position on learning led him to write in 1946: 'Stimulus generalization is generalization only in the sense of failure to note distinguishing characteristics of the stimulus or to associate them with the conditioned reaction. A definite attribute of the stimulus is abstracted and forms the basis of reaction; other attributes are either not sensed or are disregarded. So long as the effective attribute is present, the reaction is elicited as an all-or-none function of that attribute. Other characteristics of the stimulus may be radically changed without affecting the reaction' (Lashley and Wade, 1946). The neo-Pavlovian gradient of similarity on a stimulus continuum is an artifact of inattention. Such a stimulus generalization is generalization by default. 11. Hebb's concept of stimulus generalization was developed in connection with his delineation of the formation of the cell assembly underlying the perception of a triangle. Hebb's contribution was to perceptual theory. He proposed the startling idea that a simple figure like an outline triangle is not perceived as a whole, innately, as alleged by the gestaltists. He went on to show how the elements of line and angle become integrated into a unified perception of a triangle. To persuade the skeptical reader, Hebb introduced the idea of perceptual identity, something that has to be learned. 
He then proposed a mechanism involving neural fractionation and recruitment. Fractionation eliminates the variable cells that are excited extramacularly. Macular excitation, which is due to ocular fixation, remains constant despite the variable angular size of the stimulus object. 12. In short, stimulus generalization emerges secondarily from the slow development of each complex concept. And yet, there is some doubt regarding the universality of stimulus generalization according to Hebb. His theory cannot always predict stimulus generalization from the learning of a simple discrimination. Take for example the learning to discriminate a vertical from a horizontal line. In this case, the stimuli belong to the category of primitive unity for which, unlike the triangle, no learning is required, according to Hebb, to build a unified percept. Nevertheless, our best guess is that, empirically, after the initial learning to discriminate the two lines, the organism would show stimulus generalization to a vertical rectangle vs. a horizontal rectangle and even to a vertical row of circles vs. a horizontal row of circles. Since there is no initial learning to build a unified perception of the vertical and horizontal lines, it is difficult to see how Hebb would derive the empirical data of stimulus generalization in this case. 13. A late chapter of commentary is entitled, 'Lashley's Enduring Legacy to Neuropsychological Theory'. A contemporary perspective is offered in reviewing the concepts of vicarious functioning, equipotentiality, reduplicated memory trace, the reverberatory circuit. Lashley's lesson that synapses inactive during learning can show the effects of learning is emphasized. Lashley's lesson was never acknowledged by Hebb or any of his students. Lashley derided the use of wiring diagrams in neuropsychological theory especially those derived from computer technology. Neurons are live metabolizing cells, he argued, not inert paths like copper wires. 
They interact at synapses, which are not solder joints like soldered copper wires. The synaptic contacts are variable. Furthermore, synaptic contacts may be excitatory and/or inhibitory. Soldered wires are always excitatory and fixed. Both are pathways to be sure but the differences between neurons and copper wires far outnumber their similarities. Thus brain organization cannot be modelled by circuit diagrams representing inert pathways. 14. An epilogue presents a number of personal vignettes of both Lashley and Hebb. Lashley's career was reviewed earlier in some detail in Orbach (1982). The most disturbing part of this story has to do with Lashley's racism, as alleged by Weidman (1996). I would not have raised this matter in a scholarly volume concerned with Lashley's contributions to neuropsychological theory were it not for Weidman's allegation that Lashley's racist attitudes influenced his theoretical views. But, did these odious attitudes of Lashley affect his science? I can attest to Lashley's anti-African-American attitudes, but I can find no evidence that Lashley's racism colored his theories. A lifelong student of genetics, Lashley had an abiding interest in the concept of instinct and in the genetics of behavior in general. These facts must have eluded Weidman. Both Hebb and Lashley were honored many times during their lifetimes. It is especially noteworthy that Hebb was appointed Chancellor of McGill University, and that he was nominated, in 1965, for the Nobel Prize. 15. During his freshman year, at the age of 16, Lashley studied general zoology and comparative anatomy with Albert M. Reese at the University of West Virginia. Reese appointed him departmental assistant, at a salary of $0.25 per hour. One of the new assistant's first tasks was to sort out various materials in the basement. The result of this assignment can best be expressed in Lashley's own words: 'Among them I found a beautiful Golgi series of the frog brain. 
I took these to Reese and proposed that I draw all of the connections between the cells. Then we would know how the frog worked (sic!). It was a shock to learn that the Golgi method does not stain all cells, but I think almost ever since I have been trying to trace those connections' (Beach in Orbach, 1982). Only later did Lashley realize that functional variables such as spatial and temporal summation, excitatory and inhibitory states, and micro-movements of elements influencing synaptic contact need not be represented microscopically. The lesson is that neurons are not inert and static, like soldered wires. They are live metabolizing cells with synaptic contacts that vary. If Lashley were alive today, there is no doubt that he would continue to scold modern neuroscientists who still have not become aware of the importance of this fact. 16. Part 2 of the book consists of nine of Lashley's major theoretical papers reprinted in their entirety. These are listed in the References below. Part 2 also includes Lashley's four Vanuxem Lectures given at Princeton University in 1952, and published here for the first time. In these lectures, Lashley referred to the anatomical observations of Lorente de Nó and emphasized the neural net as the active neural unit in the cerebral cortex. He introduced the idea of a neural priming or presetting, concepts all highly reminiscent of Hebb's theorizing on the central autonomous process and the cell assembly. The term neural lattice was coined by Lashley in 1949. This term was discarded by Hebb in his 1949 monograph in favor of cell assembly. REFERENCES: http://www.wabash.edu/depart/psych/Courses/Psych_81/LASHLEY.HTM http://www.archives.mcgill.ca/guide/volume2/gen01.htm#HEBB, DONALD OLDING http://www.princeton.edu/~harnad/hebb.html http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.amit.html Hebb, D. O. (1949) The Organization of Behavior: a Neuropsychological Theory. New York: Wiley. Hebb, D. O. and Donderi, D.C. 
(1994) Textbook of Psychology, fourth edition, revised. Dubuque, Iowa: Kendall/Hunt Publishing Company. Hilgard, E. R. and Marquis, D. G. (1940) Conditioning and Learning, NY: Appleton-Century. Lashley, K. S. (1924) 'Studies of cerebral function in learning. VI. The theory that synaptic resistance is reduced by the passage of the nerve impulse.' Psychol. Rev., 31, 369-375. Lashley, K. S. (1931) 'Mass action in cerebral function.' Science, 73, 245-254. Lashley, K. S. (1942) 'The problem of cerebral organization in vision.' Biol. Symp., 7, 301-322. Lashley, K. S. (1949) 'Persistent problems in the evolution of mind.' Quart. Rev. Biol., 24, 28-42. Lashley, K. S. (1950) 'In search of the engram.' In Symp. Soc. Exp. Biol. No. 4, Cambridge, England: Cambridge Univ. Press. Lashley, K. S. (1951) 'The problem of serial order in behavior.' In Jeffress, L. A. (Ed.) Cerebral Mechanisms in Behavior, New York: Wiley. Lashley, K. S. (1952) Vanuxem Lectures delivered at Princeton University in Feb. 1952. Untitled. Lashley, K. S. (1954) 'Dynamic processes in perception.' In Adrian, E. D., Bremer, F. and Jasper, H. H. (Eds.) Brain Mechanisms and Consciousness. Illinois: Charles C. Thomas, 422-443. Lashley, K. S. (1968) 'Cerebral organization and behavior.' In The Brain and Human Behavior, Proc. Ass. Res. Nerv. Ment. Dis., 36, 1-18. Lashley, K. S. and Wade, M. (1946) 'The Pavlovian theory of generalization.' Psychol. Rev., 53, 72-87. Lashley, K. S., Chow, K.-L., and Semmes, J. (1951) 'An examination of the electrical field theory of cerebral integration.' Psychol. Rev., 58, 123-136. Orbach, J. (1982) Neuropsychology After Lashley: Fifty Years Since the Publication of Brain Mechanisms and Intelligence. Hillsdale, NJ: Lawrence Erlbaum Associates. Weidman, N. (1996) 'Psychobiology, progressivism, and the anti-progressive tradition.' J. Hist. Biol., 29, 267-308. 
From jon at syseng.anu.edu.au Mon Sep 20 04:12:32 1999 From: jon at syseng.anu.edu.au (Jonathan Baxter) Date: Mon, 20 Sep 1999 18:12:32 +1000 Subject: New paper on Direct Reinforcement Learning Message-ID: <37E5EC70.90621447@syseng.anu.edu.au> The following paper is available from http://wwwsyseng.anu.edu.au/~jon/papers/drlexp.ps.gz. It is a sequel to http://wwwsyseng.anu.edu.au/~jon/papers/drlalg.ps.gz All comments welcome. Title: Direct Gradient-Based Reinforcement Learning: II. Gradient Ascent Algorithms and Experiments Authors: Jonathan Baxter, Lex Weaver and Peter Bartlett Australian National University Abstract: In \cite{drl1} we introduced \pomdpg, an algorithm for computing arbitrarily accurate approximations to the performance gradient of parameterized partially observable Markov decision processes (\pomdps). The algorithm's chief advantages are that it requires only a single sample path of the underlying Markov chain, it uses only one free parameter $\beta\in [0,1)$ which has a natural interpretation in terms of bias-variance trade-off, and it requires no knowledge of the underlying state. In addition, the algorithm can be applied to infinite state, control and observation spaces. In this paper we present \conjgrad, a conjugate-gradient ascent algorithm that uses \pomdpg\ as a subroutine to estimate the gradient direction. \conjgrad\ uses a novel line-search routine that relies solely on gradient estimates and hence is robust to noise in the performance estimates. \olpomdp, an on-line gradient ascent algorithm based on \pomdpg\ is also presented. The chief theoretical advantage of this gradient based approach over value-function-based approaches to reinforcement learning is that it guarantees improvement in the performance of the policy at {\em every} step. 
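The abstract's estimator can be loosely illustrated in code. Below is a minimal sketch of an OLPOMDP-style online gradient-ascent update on a toy two-state chain; the chain, reward function, step size, and all names are invented for illustration, and this is a sketch in the spirit of the abstract, not the authors' implementation. As in the paper, beta in [0,1) discounts the eligibility trace and governs a bias-variance trade-off.

```python
import numpy as np

# Sketch of an OLPOMDP-style online policy-gradient update using a single
# sample path of the chain (hypothetical toy setup, not the authors' code).

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

n_states, n_actions = 2, 2
theta = np.zeros((n_states, n_actions))   # tabular softmax policy parameters
beta, alpha = 0.9, 0.05                   # trace discount, learning rate

def env_step(s, a):
    # Toy dynamics: action 0 stays in the current state, action 1 flips it;
    # reward 1 for landing in state 1, else 0.
    s_next = s if a == 0 else 1 - s
    return s_next, float(s_next == 1)

s = 0
z = np.zeros_like(theta)                  # eligibility trace
for t in range(20000):
    probs = softmax(theta[s])
    a = rng.choice(n_actions, p=probs)
    # grad of log pi(a | s; theta) for a tabular softmax policy
    g = np.zeros_like(theta)
    g[s] = -probs
    g[s, a] += 1.0
    s_next, r = env_step(s, a)
    z = beta * z + g                      # discounted eligibility trace
    theta += alpha * r * z                # online ascent on average reward
    s = s_next

# After training, the policy should prefer reaching, and staying in, state 1.
```

The point of the trace is that credit for a reward flows back to recent actions with weight decaying as beta^k, so only one pass over a single trajectory is needed and no model of the underlying state is consulted.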
To show that this advantage is real, we give experimental results in which \conjgrad\ was used to optimize a simple three-state Markov chain controlled by a linear function, a two-dimensional ``puck'' controlled by a neural network, a call admission queueing problem, and a variation of the classical ``mountain-car'' task. In all cases the algorithm rapidly found optimal or near-optimal solutions. From berthold at ICSI.Berkeley.EDU Mon Sep 20 18:23:04 1999 From: berthold at ICSI.Berkeley.EDU (Michael Berthold) Date: Mon, 20 Sep 1999 15:23:04 -0700 (PDT) Subject: new book: Intelligent Data Analysis, An Introduction Message-ID: <199909202223.PAA13641@fondue.ICSI.Berkeley.EDU> The following textbook might be of interest to readers of the Connectionist mailing list: "Intelligent Data Analysis: An Introduction" edited by Michael Berthold and David J. Hand (Springer-Verlag, 1999. ISBN 3-540-65808-4) The idea for this book arose when, through teaching classes on IDA and doing consulting work, we realized that there was no coherent textbook to which we could direct students or interested researchers and practitioners in the field. We considered writing such a book ourselves, but abandoned this idea when we realized how wide the range of topics to be covered would be. Instead, we decided to invite appropriate experts to contribute separate chapters on various fields, and we took pains to ensure that these chapters complemented and built on each other, so that a rounded picture resulted. Our aim was that, rather than focusing on state-of-the-art research, where it is always difficult to tell which ideas will turn out to be really important, each chapter should provide a thorough introduction to its domain. 
The areas covered are: - Statistical Concepts (Chapter 2) - Statistical Methods (Chapter 3) - Bayesian Methods (Chapter 4) - Analysis of Time Series (Chapter 5) - Rule Induction (Chapter 6) - Neural Networks (Chapter 7) - Fuzzy Logic (Chapter 8) - Stochastic Search Methods (Chapter 9) The book begins with an introduction to the field of intelligent data analysis (Chapter 1) and concludes with a discussion of applications (Chapter 10) and a list of available tools (Appendix A). The table of contents and the preface can be accessed from the Springer (Germany) web-site at: http://www.springer.de/cgi-bin/search_book.pl?isbn=3-540-65808-4 We hope that the community will find this book useful. From magnus at cs.man.ac.uk Tue Sep 21 10:09:57 1999 From: magnus at cs.man.ac.uk (Magnus Rattray) Date: Tue, 21 Sep 1999 15:09:57 +0100 Subject: PhD studentship: Riemannian geometry of neural networks and statistical models Message-ID: <37E791B5.CF5D80A3@cs.man.ac.uk> ------------------------------------------- PhD studentship: Riemannian geometry of neural networks and statistical models ------------------------------------------- Applications are sought for a three year PhD position to study various applications of Riemannian geometry in neural networks and statistical models. The position will be supported by an EPSRC studentship and based in the computer science department at Manchester University, which is one of the largest and most successful computer science departments in the UK. Living expenses will be paid according to current EPSRC rates (19635 pounds over three years) with substantial extra funding available for participation at international conferences and workshops. For more details contact: Magnus Rattray (magnus at cs.man.ac.uk) Computer Science Department, University of Manchester, Manchester M13 9PL, UK. Tel +44 161 275 6187. 
http://www.cs.man.ac.uk/~magnus/magnus.html Start date: Immediate From mieko at hip.atr.co.jp Wed Sep 22 00:30:09 1999 From: mieko at hip.atr.co.jp (Mieko Namba) Date: Wed, 22 Sep 1999 13:30:09 +0900 Subject: Neural Networks 12(7&8) Message-ID: <199909220437.NAA08352@mailhost.hip.atr.co.jp> NEURAL NETWORKS SPECIAL ISSUE 1999 12(7&8) Contents - Volume 12, Number 7&8 - 1999 ------------------------------------------------------------ ARTICLES: Towards the networks of the brain: from brain imaging to consciousness J.G. Taylor What are the computations of the cerebellum, the basal ganglia, and the cerebral cortex? K. Doya Sequence generation in arbitrary temporal patterns from theta-nested gamma oscillations: a model of the basal ganglia-thalamo-cortical loops T. Fukai A model of computation in neocortical architecture E. Korner, M.O. Gewaltig, U. Korner, A. Richter, and T. Rodemann Architecture and dynamics of the primate prefrontal cortical circuit for spatial working memory S. Tanaka Computation of pattern invariance in brain-like structures S. Ullman and S. Soloviev Unsupervised visual learning of 3D objects using a modular network architecture H. Ando, S. Suzuki, and T. Fujita Organization of face and object recognition in modular neural network models M.N. Dailey, and G.W. Cottrell On redundancy in neural architecture: dynamics of a simple module-based neural network and initial state independence K. Tsutsumi Complex behavior by means of dynamical systems for an anthropomorphic robot T. Bergener, C. Bruckhoff, P. Dahm, H. Janssen, F. Joublin, R. Menzner, A. Steinhage, and W. Von Seelen Generative character of perception: a neural architecture for sensorimotor anticipation H.M. Gross, A. Heinze, T. Seiler, and V. Stephan Learning to perceive the world as articulated: an approach for hierarchical learning in sensory-motor systems J. Tani, and S. Nolfi Adaptive internal state space construction method for reinforcement learning of a real-world agent K. 
Samejima, and T. Omori Emergence of symbolic behavior from brain-like memory with dynamic attention T. Omori, A. Mochizuki, K. Mizutani, and M. Nishizaki Internal models in the control of posture P. Morasso, L. Baratto, R. Capra, and G. Spada Temporally correlated inputs to leaky integrate-and-fire models can reproduce spiking statistics of cortical neurons Y. Sakai, S. Funahashi, and S. Shinomoto The consolidation of learning during sleep: comparing the pseudorehearsal and unlearning accounts A. Robins, and S. McCallum \\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\ SPECIAL DISCOUNT for the SPECIAL ISSUE! \\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\ We have set a special discount for the special issue! The ordinary price is US$ 60, but we are offering it for US$ 30 (a 50% discount) if you order before November 15, 1999. The contact addresses for ordering are the same as given below. \\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ______________________________ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. 
Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.

----------------------------------------------------------------------------
Membership Type         INNS           ENNS          JNNS
----------------------------------------------------------------------------
membership with         $80            660 SEK       Y 15,000 [including
Neural Networks                                      Y 2,000 entrance fee]
  (student)             $55            460 SEK       Y 13,000 [including
                                                     Y 2,000 entrance fee]
----------------------------------------------------------------------------
membership without      $30            200 SEK       not available to
Neural Networks                                      non-students (subscribe
                                                     through another society)
  (student)                                          Y 5,000 [including
                                                     Y 2,000 entrance fee]
----------------------------------------------------------------------------
Institutional rates     $1132          2230 NLG      Y 149,524
----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
   OR    [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at
talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ***************************************************************** end. ========================================================= Mieko Namba Secretary to Dr. Mitsuo Kawato Editorial Administrator of NEURAL NETWORKS ATR Human Information Processing Research Laboratories 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-1008 E-MAIL mieko at hip.atr.co.jp ========================================================= From harnad at coglit.ecs.soton.ac.uk Thu Sep 23 15:23:03 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Thu, 23 Sep 1999 20:23:03 +0100 (BST) Subject: PSYC Call for Commentators: HYPERSTRUCTURE/BRAIN/COGNITION Message-ID: Richardson: HYPERSTRUCTURE IN BRAIN AND COGNITION http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.031 The target whose abstract appears below has just appeared in PSYCOLOQUY, a refereed journal of Open Peer Commentary sponsored by the American Psychological Association. Qualified professional biobehavioral, neural or cognitive scientists are hereby invited to submit Open Peer Commentary on it. Please email or see websites for Instructions if you are not familiar with format or acceptance criteria for PSYCOLOQUY commentaries (all submissions are refereed).
To link to the full text of this article: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.031 To submit articles and commentaries or to seek information: EMAIL: psyc at pucc.princeton.edu URL: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc ----------------------------------------------------------------------- psycoloquy.99.10.031.hyperstructure.richardson Thu Sep 23 1999 ISSN 1055-0143 (71 pars, 60 refs, 6 figs, 1 table, 1389 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 1999 Ken Richardson HYPERSTRUCTURE IN BRAIN AND COGNITION Target Article on Hyperstructure Ken Richardson Centre for Human Development & Learning The Open University Walton Hall Milton Keynes MK7 6AA United Kingdom k.richardson at open.ac.uk ABSTRACT: This target article tries to identify the informational content of experience underlying object percepts and concepts in complex, changeable environments, in a way which can be related to higher cerebral functions. In complex environments, repetitive experience of feature- and object-images in static, canonical form is rare, and this remains a problem in current theories of conceptual representation. The only reliable information available in natural experience consists of nested covariations or 'hyperstructures'. These need to be registered in a representational system. Such representational hyperstructures can have novel emergent structures and evolution into 'higher' forms of representation, such as object concepts and event- and social-schemas. Together, these can provide high levels of predictability. A sketch of a model of hyperstructural functions in object perception and conception is presented. Some comparisons with related views in the literature of the recent decades are made, and some empirical evidence is briefly reviewed. 
KEYWORDS: complexity, covariation, features, hypernetwork, hyperstructure, object concepts, receptive field, representation http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.031 From shirish at csa.iisc.ernet.in Mon Sep 27 04:05:45 1999 From: shirish at csa.iisc.ernet.in (Shirish K. Shevade) Date: Mon, 27 Sep 1999 13:35:45 +0530 (IST) Subject: TR Announcement Message-ID: Technical Report Announcement: Smola and Sch\"{o}lkopf's SMO algorithm for SVM regression is very simple and easy to implement. In a recent paper we suggested some improvements to Platt's SMO algorithm for SVM classifier design. In this report we extend those ideas to Smola and Sch\"{o}lkopf's SMO algorithm for regression. The resulting modified algorithms run much faster than the original SMO. Details are given in the Technical Report mentioned below. A gzipped postscript file containing the report can be downloaded from: http://guppy.mpe.nus.edu.sg/~mpessk/ Send any comments to: shirish at csa.iisc.ernet.in ---------------------------------------------------------------------------- Improvements to SMO Algorithm for SVM Regression Technical Report CD-99-16 S.K. Shevade, S.S. Keerthi, C. Bhattacharyya & K.R.K. Murthy Abstract This paper points out an important source of confusion and inefficiency in Smola and Sch\"{o}lkopf's Sequential Minimal Optimization (SMO) algorithm for regression that is caused by the use of a single threshold value. Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO. These modified algorithms perform significantly faster than the original SMO on the datasets tried. ---------------------------------------------------------------------------- From simone at eealab.unian.it Mon Sep 27 02:18:11 1999 From: simone at eealab.unian.it (Simone G.O. 
Fiori) Date: Mon, 27 Sep 1999 08:18:11 +0200 Subject: Papers available on SOM and BSS-ICA Message-ID: <1.5.4.32.19990927061811.0067c17c@prometeo.eealab.unian.it> Dear Connectionists, the following two papers are now available: "A Review of Artificial Neural Networks Applications in Microwave Computer-Aided Design" by Pietro Burrascano, Simone Fiori, and Mauro Mongiardo University of Perugia, Perugia - Italy Abstract Neural networks have found significant applications in microwave CAD. In this paper, after providing a brief description of the neural networks employed so far in this context, we illustrate some of their most significant applications and typical issues arising in practical implementation. We also summarize current research tendencies and introduce the use of self-organizing maps (SOMs) to enhance model accuracy and applicability. We conclude by considering some future developments and the exciting perspectives opened by the use of neural networks in microwave CAD. Keywords Artificial neural networks; Self-organizing maps; Microwave components; Filter design. Journal International Journal of RF and Microwave CAE, Vol. 9, pp. 158 -- 174, 1999 ============================================================== "Entropy Optimization by the PFANN Network: Application to Blind Source Separation" by Simone Fiori University of Perugia, Perugia - Italy Abstract The aim of this paper is to present a study of polynomial functional-link neural units that learn through an information-theoretic criterion. First the structure of the neuron is presented and the unsupervised learning theory is explained and discussed, with particular attention being paid to its probability density function and cumulative distribution function approximation capability. Then a neural network formed by such neurons (the polynomial functional-link artificial neural network, or PFANN) is shown to be able to separate out linearly mixed heterokurtic source signals, i.e.
signals endowed with either positive or negative kurtoses. In order to compare the performance of the proposed blind separation technique with those exhibited by existing methods, the mixture of densities (MOD) approach of Xu et al., which is closely related to PFANN, is briefly recalled; then comparative numerical simulations performed on both synthetic and real-world signals and a complexity evaluation are illustrated. These results show that the PFANN approach gives similar performance with a noticeable reduction in computational effort. Journal Network: Computation in Neural Systems, Vol. 10, No. 2, pp. 171 -- 186, 1999 Requests for reprints should be addressed to: Dr. Simone Fiori Neural Networks Research Group at the Dept. of Industrial Engineering University of Perugia - Perugia, Italy Loc. Pentima bassa, 21 I-05100, TERNI E-mail: simone at eealab.unian.it, sfr at unipg.it Best regards, Simone From mike at deathstar.psych.ualberta.ca Mon Sep 27 15:28:38 1999 From: mike at deathstar.psych.ualberta.ca (Michael R.W. Dawson) Date: Mon, 27 Sep 1999 13:28:38 -0600 (MDT) Subject: Jobs at U.ofA. Message-ID: DEPARTMENT OF PSYCHOLOGY, UNIVERSITY OF ALBERTA Two positions in Computational Psychology / Computational Neuroscience or in Cognitive Engineering The Department of Psychology at the University of Alberta is seeking to expand its program in Computational Psychology and Cognitive Engineering. Two tenure-track positions at the Assistant Professor level in Computational Psychology / Computational Neuroscience or in Cognitive Engineering will be open to competition. Appointments will be effective July 1, 2000. Candidates in Computational Psychology / Computational Neuroscience should have a strong interest in modeling and predicting human behavior, or in modeling of brain functions at the level of neurons, neuronal groups or large brain subsystems using formal approaches such as mathematical modeling, neural networks, evolutionary computing, or computer simulations.
Candidates in Cognitive Engineering should have a strong interest in interaction of humans with computers, machines or complex environments, in decision-support systems in industrial, medical or emergency situations, and in the design of computer-based tools to support and enhance performance of humans in these situations. The expectation is that the successful candidates will secure competitive research funds and/or industrial support. Hiring decisions will be made on the basis of demonstrated research capability, teaching ability, potential for interactions with colleagues and fit with departmental needs. The applicant should send a curriculum vitae, a statement of current and future research plans, recent publications, and arrange to have at least three letters of reference forwarded, to: Dr Terry Caelli, Chair, Department of Psychology P220 Biological Sciences Building University of Alberta Edmonton, Alberta Canada T6G 2E9. Closing date for applications is December 1, 1999. Further information on these positions can be obtained from http://web.psych.ualberta.ca/hiring. In accordance with Canadian Immigration requirements, this advertisement is directed to Canadian Citizens and permanent residents. If suitable Canadian citizens and permanent residents cannot be found, other individuals will be considered. The University of Alberta is committed to the principle of equity in employment. As an employer we welcome diversity in the workplace and encourage applications from all qualified women and men, including Aboriginal peoples, persons with disabilities, and members of visible minorities. _____________________________________________________ Background Computational Psychology / Computational Neuroscience Computational Psychology is concerned with the generation of formal representations and algorithms for modeling, predicting and improving human behavior. 
Computational Neuroscience, on the other hand, is concerned with modeling brain functions at different levels: at the level of single neurons, at the level of neuronal groups, and at the level of brain subsystems. Both Computational Psychology and Computational Neuroscience rely on a wealth of formal approaches: mathematical modeling, neural networks, evolutionary computing, computer simulations, uncertainty calculi, HMMs, etc. Hiring in this area not only strengthens the expertise in our program, but also helps to increase collaborative ties between programs (in particular with BCN) and between departments (in particular with Neuroscience and with Computing Science). Cognitive Engineering is concerned with the interaction of humans with complex environments, such as the interaction of humans with computers, machines or complex (typically industrial) environments, with decision-support systems in industrial, medical or emergency situations, with the design of computer-based tools to support and enhance performance of humans in these situations, and with methods to efficiently train humans for these situations. Cognitive Engineering relies on a variety of methods and tools, including performance assessment, spatial information systems and methods for developing computer support technologies (e.g. expert systems, uncertainty, machine learning). Cognitive Engineering has close links to Human Factors and industrial applications. -- Professor Michael R.W. Dawson | mike at bcp.psych.ualberta.ca | (780)-492-5175 Biological Computation Project, Dept. of Psychology, University of Alberta Edmonton, AB, CANADA T6G 2E9 | http://www.bcp.psych.ualberta.ca/~mike/ From sutton at research.att.com Tue Sep 28 14:18:49 1999 From: sutton at research.att.com (Rich Sutton) Date: Tue, 28 Sep 1999 14:18:49 -0400 Subject: two papers on reinforcement learning Message-ID: This is to announce the availability of two papers on reinforcement learning.
-------------------------------------------------------------------------------- Policy Gradient Methods for Reinforcement Learning with Function Approximation Richard S. Sutton, David McAllester, Satinder Singh, and Yishay Mansour Accepted for presentation at NIPS'99 Function approximation is essential to reinforcement learning, but the standard approach of approximating a value function and determining a policy from it has so far proven theoretically intractable. In this paper we explore an alternative approach in which the policy is explicitly represented by its own function approximator, independent of the value function, and is updated according to the gradient of expected reward with respect to the policy parameters. Williams's REINFORCE method and actor--critic methods are examples of this approach. Our main new result is to show that the gradient can be written in a form suitable for estimation from experience aided by an approximate action-value or advantage function. Using this result, we prove for the first time that a version of policy iteration with arbitrary differentiable function approximation is convergent to a locally optimal policy. ftp://ftp.cs.umass.edu/pub/anw/pub/sutton/SMSM-NIPS99-submitted.ps.gz or ftp://ftp.cs.umass.edu/pub/anw/pub/sutton/SMSM-NIPS99-submitted.pdf -------------------------------------------------------------------------------- -------------------------------------------------------------------------------- Between MDPs and Semi-MDPs: A Framework for Temporal Abstraction in Reinforcement Learning Richard S. Sutton, Doina Precup, and Satinder Singh Accepted for publication in Artificial Intelligence (a revised version of our earlier technical report on this topic) Learning, planning, and representing knowledge at multiple levels of temporal abstraction are key, longstanding challenges for AI. 
In this paper we consider how these challenges can be addressed within the mathematical framework of reinforcement learning and Markov decision processes (MDPs). We extend the usual notion of action in this framework to include {\it options\/}---closed-loop policies for taking action over a period of time. Examples of options include picking up an object, going to lunch, and traveling to a distant city, as well as primitive actions such as muscle twitches and joint torques. Overall, we show that options enable temporally abstract knowledge and action to be included in the reinforcement learning framework in a natural and general way. In particular, we show that options may be used interchangeably with primitive actions in planning methods such as dynamic programming and in learning methods such as Q-learning. Formally, a set of options defined over an MDP constitutes a semi-Markov decision process (SMDP), and the theory of SMDPs provides the foundation for the theory of options. However, the most interesting issues concern the interplay between the underlying MDP and the SMDP and are thus beyond SMDP theory. We present results for three such cases: 1) we show that the results of planning with options can be used during execution to interrupt options and thereby perform even better than planned, 2) we introduce new {\it intra-option\/} methods that are able to learn about an option from fragments of its execution, and 3) we propose a notion of subgoal that can be used to improve the options themselves. All of these results have precursors in the existing literature; the contribution of this paper is to establish them in a simpler and more general setting with fewer changes to the existing reinforcement learning framework. In particular, we show that these results can be obtained without committing to (or ruling out) any particular approach to state abstraction, hierarchy, function approximation, or the macro-utility problem. 
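For readers who want to experiment with the SMDP view of options described in this abstract, the sketch below runs SMDP Q-learning on a toy corridor world with two primitive actions and a single hand-coded "go-right" option. The environment, the option, and all constants are invented here for illustration and are not taken from the paper; the one paper-relevant detail is the backup, which discounts by gamma to the power of the option's duration k.

```python
import random

# SMDP Q-learning over options on a toy corridor MDP (illustrative only).
N = 6            # states 0..5; state 5 is terminal and rewarding
GAMMA = 0.9      # discount factor
ALPHA = 0.1      # learning rate
EPS = 0.2        # exploration rate

def step(s, a):
    """One primitive step; a is -1 or +1. Reward 1.0 on reaching state N-1."""
    s2 = min(max(s + a, 0), N - 1)
    return s2, (1.0 if s2 == N - 1 else 0.0), s2 == N - 1

def run_option(s, o):
    """Execute option o from s; return (s', discounted return, duration k, done)."""
    if o in (-1, +1):                    # primitive actions are one-step options
        s2, r, done = step(s, o)
        return s2, r, 1, done
    # hand-coded 'go-right' option: move right until the corridor's end
    total, disc, k, done = 0.0, 1.0, 0, False
    while not done:
        s, r, done = step(s, +1)
        total += disc * r
        disc *= GAMMA
        k += 1
    return s, total, k, done

options = [-1, +1, "go-right"]
Q = {(s, o): 0.0 for s in range(N) for o in options}

random.seed(0)
for _ in range(500):
    s, done = 0, False
    while not done:
        if random.random() < EPS:
            o = random.choice(options)
        else:
            o = max(options, key=lambda x: Q[(s, x)])
        s2, r, k, done = run_option(s, o)
        # SMDP backup: discount by GAMMA ** k, with k the option's duration
        target = r if done else r + GAMMA ** k * max(Q[(s2, x)] for x in options)
        Q[(s, o)] += ALPHA * (target - Q[(s, o)])
        s = s2

best = max(options, key=lambda o: Q[(0, o)])   # greedy option at the start state
```

The only change relative to ordinary one-step Q-learning is that the reward is the discounted return accumulated inside the option and the bootstrap term is discounted by gamma**k; this is exactly the sense in which a set of options over an MDP forms an SMDP.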
ftp://ftp.cs.umass.edu/pub/anw/pub/sutton/SPS-aij.ps.gz -------------------------------------------------------------------------------- From robbie at bcs.rochester.edu Tue Sep 28 09:01:50 1999 From: robbie at bcs.rochester.edu (Robbie Jacobs) Date: Tue, 28 Sep 1999 09:01:50 -0400 (EDT) Subject: visual cue combination articles Message-ID: <199909281301.JAA08097@broca.bcs.rochester.edu> The following two articles are published in the journal Vision Research, but may be of interest to readers of this list: (1) Jacobs, R.A. (1999) Optimal integration of texture and motion cues to depth. Vision Research, 39, 3621-3629. (2) Jacobs, R.A. and Fine, I. (1999) Experience-dependent integration of texture and motion cues to depth. Vision Research, 39, 4062-4075. ========================================= (1) Jacobs, R.A. (1999) Optimal integration of texture and motion cues to depth. Vision Research, 39, 3621-3629. We report the results of a depth-matching experiment in which subjects were asked to adjust the height of an ellipse until it matched the depth of a simulated cylinder defined by texture and motion cues. On one-third of the trials the shape of the cylinder was primarily given by motion information, on one-third of the trials it was given by texture information, and on the remaining trials it was given by both sources of information. Two optimal cue combination models are described where optimality is defined in terms of Bayesian statistics. The parameter values of the models are set based on subjects' responses on trials when either the motion cue or the texture cue was informative. These models provide predictions of subjects' responses on trials when both cues were informative. The results indicate that one of the optimal models provides a good fit to the subjects' data, and the second model provides an exceptional fit. 
Because the predictions of the optimal models closely match the experimental data, we conclude that observers' cue combination strategies are indeed optimal, at least under the conditions studied here. Available on the web at: www.bcs.rochester.edu/bcs/people/faculty/robbie/jacobs.vr99.ps.Z ========================================= (2) Jacobs, R.A. and Fine, I. (1999) Experience-dependent integration of texture and motion cues to depth. Vision Research, 39, 4062-4075. Previous investigators have shown that observers' visual cue combination strategies are remarkably flexible in the sense that these strategies adapt on the basis of the estimated reliabilities of the visual cues. However, these researchers have not addressed how observers acquire these estimated reliabilities. This article studies observers' abilities to learn cue combination strategies. Subjects made depth judgments about simulated cylinders whose shapes were indicated by motion and texture cues. Because the two cues could indicate different shapes, it was possible to design tasks in which one cue provided useful information for making depth judgments, whereas the other cue was irrelevant. The results of Experiment One suggest that observers' cue combination strategies are adaptable as a function of training; subjects adjusted their cue combination rules to use a cue more heavily when the cue was informative on a task versus when the cue was irrelevant. Experiment Two demonstrated that experience-dependent adaptation of cue combination rules is context-sensitive. On trials with presentations of short cylinders, one cue was informative, whereas on trials with presentations of tall cylinders, the other cue was informative. The results suggest that observers can learn multiple cue combination rules, and can learn to apply each rule in the appropriate context. Experiment Three demonstrated a possible limitation on the context-sensitivity of adaptation of cue combination rules. 
One cue was informative on trials with presentations of cylinders at a left oblique orientation, whereas the other cue was informative on trials with presentations of cylinders at a right oblique orientation. The results indicate that observers did not learn to use different cue combination rules in different contexts under these circumstances. These results are consistent with the hypothesis that observers' visual systems are biased to learn to perceive in the same way views of bilaterally symmetric objects that differ solely by a symmetry transformation. Taken in conjunction with the results of Experiment Two, this means that the visual learning mechanism underlying cue combination adaptation is biased such that some sets of statistics are more easily learned than others. Available on the web at: www.bcs.rochester.edu/bcs/people/faculty/robbie/jacobsfine.vr99.ps.Z From RK at hirn.uni-duesseldorf.de Fri Sep 24 11:56:07 1999 From: RK at hirn.uni-duesseldorf.de (Rolf Kotter) Date: Fri, 24 Sep 1999 17:56:07 +0200 Subject: PTRS - call for submissions Message-ID: <37EB9F17.5C02383@hirn.uni-duesseldorf.de> ============================================================================ CALL FOR SUBMISSIONS Theme issue of Philosophical Transactions: Biological Sciences http://www.pubs.royalsoc.ac.uk/publish/phi_bs/ Theme: NEUROSCIENCE DATABASES - tools for exploring structure-function relationships in the brain Theme editor: Rolf Kotter Understanding the workings of systems as complex as the nervous system requires the aid of computational tools to collate, analyse and test experimental data with the aim of establishing close structure-function relationships. Over recent years, many attempts have been made to construct neuroscience databases collating data about structures, small functional circuits and global functions of the nervous system.
The aim of this Theme Issue is to critically review the achievements and problems of previous and current approaches, to devise future strategies and to make these insights available to everyone concerned with neuroscience databases. More specifically, papers are expected to cover one or more of the following topics:
- adequate representations of different types of neuroscientific data; identification of promising research fields versus problem data; arguments for representation of individual vs. summary, and raw vs. interpreted data
- tools for meta-analysis of data in neuroscience databases and methods (statistical, computational, etc.) for establishing structure-function relationships in the brain
- quality control of database contents: access control, peer review, links to publications
- strategies to improve the contents, user-friendliness, acceptance, significance and longevity of databases; desirable developments in other fields, e.g. data visualisation
- lessons to be learnt from databases in fields beyond neuroscience (e.g. gene sequences, protein structure, images, references; see Human Genome Project or U.S. National Center for Biotechnology Information)
- technical and organisational issues of design, programming (reusable code?), maintenance, updating and cross-linking of databases and hardware platforms; support teams, financial requirements, life-cycles of databases
- impact of databases on the neuroscience communities; relationship between experimentalists (data producers), data collators and data analysts: who wants and who needs databases, and how do databases affect the production and publication of data?
These topics are often best addressed within the context of specific database projects, but note that it is not sufficient to simply present your database project. Finally, although the contents of the contributed papers can be quite specialised, their concept, thrust and significance should be intelligible to interested non-specialists.
SCHEDULE All submissions will be subject to a rigorous review process. The Theme Issue will contain 10-15 refereed papers, which are going to be invited on the basis of abstract submissions. 1 November 1999 Submission of an abstract and a tentative title declaring intention to submit a full paper by the deadline given below. This should be done by e-mail to RK at hirn.uni-duesseldorf.de After selection process and invitation of full papers: 31 May 2000 Deadline for receipt of full paper in three copies. After referees' comments and (if necessary) revisions: 31 December 2000 Finalisation of all papers for publication of theme issue in 2001. ADDRESS FOR SUBMISSIONS AND CORRESPONDENCE Dr. Rolf Kotter C. & O. Vogt Brain Research Institute, Bldg. 22.03 Heinrich Heine University, Universitatsstr. 1 D-40225 Dusseldorf, Germany phone + fax: +49-211-81-12095 e-mail: RK at hirn.uni-duesseldorf.de http://www.hirn.uni-duesseldorf.de/~rk ============================================================================ From abla at gatsby.ucl.ac.uk Mon Sep 27 07:10:36 1999 From: abla at gatsby.ucl.ac.uk (abla@gatsby.ucl.ac.uk) Date: Mon, 27 Sep 1999 12:10:36 +0100 Subject: New Data Fusion MSc Message-ID: <3.0.6.32.19990927121036.00942d70@axon.gatsby.ucl.ac.uk> To all participants of the Gatsby Neural Computation Tutorial: Given your interest in neural networks, you (or your colleagues) might be interested to know about the new MSc in data fusion (a technology which makes use of neural networks). This was announced at the recent FUSION'99 conference held in Sunnyvale, USA . It is the first such course in the world and generated great interest in the international data fusion community. This much-needed course is being run by the School of Computing at the University of Central England in Birmingham. The course lasts for one year starting in January 2000 and is aimed at allowing people in employment to continue to work with minimal disruption. 
The first part of the course comprises taught modules delivered either during an intensive week or on a one-day-a-week basis over the semester. The course is designed in such a way that anybody may attend the one-week "Introduction to Data Fusion" module without taking the full MSc. The second part of the course involves carrying out an appropriate research project culminating in a dissertation. It is hoped that many students will identify a practical data fusion problem within their own company on which they can work to the benefit of both themselves and their company. Further details can be found on the internet at http://www.cis.uce.ac.uk/faculty/comput/courses/msc_DFroute.htm or by contacting John Perkins, MSc Course Director, by email at john.perkins at uce.ac.uk or by phone on 0121 331 6209. If you want some informal information please email me at jane.obrien at datafusion.clara.co.uk. Jane O'Brien Visiting Fellow of the Faculty of Computing, Information and Computing University of Central England From espaa at exeter.ac.uk Mon Sep 27 10:51:47 1999 From: espaa at exeter.ac.uk (ESPAA) Date: Mon, 27 Sep 1999 15:51:47 +0100 (GMT Daylight Time) Subject: PAA issue 2(3) contents Message-ID: Pattern Analysis & Applications Springer Verlag Ltd. http://www.dcs.exeter.ac.uk/paa (JOURNAL WEBSITE) http://link.springer.de/link/service/journals/10044/index.htm (SPRINGER ELECTRONIC SERVICE) ISSN: 1433-7541 (printed version) ISSN: 1433-755X (electronic version) Table of Contents Vol. 2 Issue 3 L. P. Cordella, P. Foggia, C. Sansone, F. Tortorella, M. Vento: Reliability Parameters to Improve Combination Strategies in Multi-Expert Systems Pattern Analysis & Applications 2 (1999) 3, 205-214 P. Foggia, C. Sansone, F. Tortorella, M. Vento: Definition and Validation of a Distance Measure Between Structural Primitives Pattern Analysis & Applications 2 (1999) 3, 215-227 Z. Lou, K. Liu, J. Y. Yang, C. Y.
Suen: Rejection Criteria and Pairwise Discrimination of Handwritten Numerals Based on Structural Features Pattern Analysis & Applications 2 (1999) 3, 228-238 J. Y. Goulermas, P. Liatsis: Incorporating Gradient Estimations in a Circle-Finding Probabilistic Hough Transform Pattern Analysis & Applications 2 (1999) 3, 239-250 J. G. Keller, S. K. Rogers, M. Kabrisky, M. E. Oxley: Object Recognition Based on Human Saccadic Behaviour Pattern Analysis & Applications 2 (1999) 3, 251-263 __________________________________ Oliver Jenkin Editorial Secretary Pattern Analysis and Applications Department of Computer Science University of Exeter Exeter EX4 4PT tel: +44-1392-264066 fax: +44-1392-264067 email: espaa at exeter.ac.uk ____________________________ From cindy at cns.bu.edu Tue Sep 28 10:51:47 1999 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Tue, 28 Sep 1999 10:51:47 -0400 Subject: call for papers: ICCNS 2000 Message-ID: <199909281451.KAA16678@retina.bu.edu> ***** CALL FOR PAPERS ***** FOURTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS Tutorials: May 24, 2000 Meeting: May 25-27, 2000 Boston University 677 Beacon Street Boston, Massachusetts 02215 http://cns-web.bu.edu/meetings/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems This interdisciplinary conference has drawn about 300 people from around the world each time that it has been offered. Last year's conference was attended by scientists from 30 countries. The conference is structured to facilitate intense communication between its participants, both in the formal sessions and during its other activities. As during previous years, the millennium conference will focus on solutions to the fundamental questions: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? 
The conference will include invited tutorials and lectures, and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. Abstract-only submission encourages presentation of the latest results. Costs are kept at a minimum without compromising the quality of meeting handouts and social events.

CALL FOR ABSTRACTS

Session Topics:
* vision
* spatial mapping and navigation
* object recognition
* neural circuit models
* image understanding
* neural system models
* audition
* mathematics of neural systems
* speech and language
* robotics
* unsupervised learning
* hybrid systems (fuzzy, evolutionary, digital)
* supervised learning
* neuromorphic VLSI
* reinforcement and emotion
* industrial applications
* sensory-motor control
* cognition, planning, and attention
* other

Contributed abstracts must be received, in English, by January 28, 2000. Notification of acceptance will be provided by email by February 29, 2000. A meeting registration fee of $50 for regular attendees and $35 for students must accompany each Abstract. See Registration Information for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted abstracts will be returned on request only until April 14, 2000. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. 
Abstract title, author name(s), affiliation(s), and mailing and email address(es) should begin each Abstract. An accompanying cover letter should include: full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; and a first and second choice from among the topics above, including whether it is biological (B) or technological (T) work. Example: first choice: vision (T); second choice: neural system models (B). (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. Accepted Abstracts will be printed in the conference proceedings volume; no paper longer than the Abstract will be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows may be available to cover meeting travel and living costs. Whether fellowship support is available will be confirmed, and broadly advertised if so, before the application deadline of January 28, 2000. Applicants will be notified by email by February 29, 2000. 
Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Fellowship checks will be distributed after the meeting. REGISTRATION FORM Fourth International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 24, 2000 Meeting: May 25-27, 2000 FAX: (617) 353-7755 http://cns-web.bu.edu/meetings/ (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. CHECK ONE: ( ) $75 Conference plus Tutorial (Regular) ( ) $50 Conference plus Tutorial (Student) ( ) $50 Conference Only (Regular) ( ) $35 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". 
Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ From S.Singh at exeter.ac.uk Wed Sep 29 06:14:48 1999 From: S.Singh at exeter.ac.uk (Sameer Singh) Date: Wed, 29 Sep 1999 11:14:48 +0100 (GMT Daylight Time) Subject: MPhil in CS (Financial Forecasting using Neural Networks) Message-ID: MPhil in Computer Science UNIVERSITY OF EXETER SCHOOL OF ENGINEERING AND COMPUTER SCIENCE Department of Computer Science Applications are now invited for an MPhil studentship in the area of "Financial Forecasting using Neural Networks". The project will develop student skills in areas including neural networks, financial forecasting, and pattern recognition. The studentship is in collaboration with Siebe Appliance Controls Ltd at Plymouth. Candidates for this studentship should have a degree in computer science, engineering or a related subject. They should have programming skills in C/C++/JAVA and knowledge of Unix operating system. The studentships cover UK/EEC fees and maintenance over two years. The successful candidates should expect to take up the studentships no later than 1 November, 1999. Applicants should send a CV, including the names and addresses of two referees, to Dr Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK (s.singh at exeter.ac.uk). Applicants should ask their referees to directly send their references to the above address. Informal enquiries can be made at +44-1392-264053. 
-------------------------------------------- Sameer Singh Director, PANN Research Department of Computer Science University of Exeter Exeter EX4 4PT UK tel: +44-1392-264053 fax: +44-1392-264067 email: s.singh at exeter.ac.uk web: http://www.dcs.exeter.ac.uk/academics/sameer -------------------------------------------- From nat at cs.dal.ca Wed Sep 29 11:28:51 1999 From: nat at cs.dal.ca (Nathalie Japkowicz) Date: Wed, 29 Sep 1999 12:28:51 -0300 (ADT) Subject: Thesis + Papers Announcement Message-ID: Dear Connectionists, I am pleased to announce the availability of my Ph.D. Dissertation and of a few related papers. Regards, Nathalie. ----------------------------------------------------------------------- Thesis: ------- Title: "Concept-Learning in the Absence of Counter-Examples: An Autoassociation-Based Approach to Classification" Advisors: Stephen Jose Hanson and Casimir A. Kulikowski URL: http://borg.cs.dal.ca/~nat/Research/thesis.ps.gz Abstract: -------- The overwhelming majority of research currently pursued within the framework of concept-learning concentrates on discrimination-based learning. Nevertheless, this emphasis can present a practical problem: there are real-world engineering problems for which counter-examples are both scarce and difficult to gather. For these problems, recognition-based learning systems are much more appropriate because they do not use counter-examples in the concept-learning phase and thus require fewer counter-examples altogether. The purpose of this dissertation is to analyze a promising connectionist recognition-based learning system---autoassociation-based classification---and answer the following questions raised by a preliminary comparison of the autoassociator and its discrimination counterpart, the Multi-Layer Perceptron (MLP), on three real-world domains: * What features of the autoassociator make it capable of performing classification in the absence of counter-examples? 
* What causes the autoassociator to be significantly more efficient than MLP in certain domains?
* What domain characteristics cause the autoassociator to be more accurate than MLP, and MLP to be more accurate than the autoassociator?

A study of the two systems in the context of these questions yields the following conclusions:

1) Autoassociation-based classification is possible in a particular class of practical domains, called non-linear and multi-modal, because the autoassociator uses a multi-modal specialization bias to compensate for the absence of counter-examples. This bias can be controlled by varying the capacity of the autoassociator.

2) The difference in efficiency between the autoassociator and MLP observed on this class of domains is caused by the fact that the autoassociator uses a (fast) bottom-up generalization strategy whereas MLP has recourse to a (slow) top-down one, despite the fact that the two systems are both trained by the backpropagation procedure.

3) The autoassociator classifies more accurately than MLP on domains requiring particularly strong specialization biases caused by the counter-conceptual class, or particularly weak specialization biases caused by the conceptual class. However, MLP is more accurate than the autoassociator on domains requiring particularly strong specialization biases caused by the conceptual class.

The results of this study thus suggest that recognition-based learning, which is often dismissed in favor of discrimination-based learning in the context of concept-learning, may present an interesting array of classification strengths. ------------------------------------------------------------------------ Related Papers: --------------- * "Nonlinear Autoassociation is not Equivalent to PCA", Japkowicz, N., Hanson S.J., and Gluck, M.A. in Neural Computation (in press). 
Abstract: --------- A common misperception within the Neural Network community is that even with nonlinearities in their hidden layer, autoassociators trained with Backpropagation are equivalent to linear methods such as Principal Component Analysis (PCA). The purpose of this paper is to demonstrate that nonlinear autoassociators actually behave differently from linear methods and that they can outperform these methods when used for latent extraction, projection and classification. While linear autoassociators emulate PCA and thus exhibit a flat or unimodal reconstruction error surface, autoassociators with nonlinearities in their hidden layer learn domains by building error reconstruction surfaces that, depending on the task, contain multiple local valleys. This particular interpolation bias allows nonlinear autoassociators to represent appropriate classifications of nonlinear multi-modal domains, in contrast to linear autoassociators which are inappropriate for such tasks. In fact, autoassociators with hidden unit nonlinearities can be shown to perform nonlinear classification and nonlinear recognition. URL: http://borg.cs.dal.ca/~nat/Papers/neuralcomp.ps.gz * "Adaptability of the Backpropagation Procedure" , Japkowicz, N. and Hanson S.J., in the proceedings of the 1999 International Joint Conference in Neural Networks (IJCNN-99) . Abstract: --------- Possible paradigms for concept learning by feedforward neural networks include discrimination and recognition. An interesting aspect of this dichotomy is that the recognition-based implementation can learn certain domains much more efficiently than the discrimination-based one, despite the close structural relationship between the two systems. The purpose of this paper is to explain this difference in efficiency. 
We suggest that it is caused by a difference in the generalization strategy adopted by the Backpropagation procedure in both cases: while the autoassociator uses a (fast) bottom-up strategy, MLP has recourse to a (slow) top-down one, despite the fact that the two systems are both optimized by the Backpropagation procedure. This result is important because it sheds some light on the nature of Backpropagation's adaptive capability. From a practical viewpoint, it suggests a deterministic way to increase the efficiency of Backpropagation-trained feedforward networks. URL: http://borg.cs.dal.ca/~nat/Papers/ijcnn-5.ps.gz * "Are we Better off without Counter Examples", Japkowicz, N., in the proceedings of the 1999 conference on Advances in Intelligent Data Analysis (AIDA-99). Abstract: --------- Concept-learning is commonly implemented using discrimination-based techniques which rely on both examples and counter-examples of the concept. Recently, however, a recognition-based approach that learns a concept in the absence of counter-examples was shown to be more accurate than its discrimination counterpart on two real-world domains and as accurate on the third. The purpose of this paper is to find out whether this recognition-based approach is generally more accurate than its discrimination counterpart or whether the results it obtained previously are purely coincidental. The analysis conducted in this paper concludes that the results obtained on the real-world domains were not coincidental, and this suggests that recognition-based approaches are promising techniques worth studying in greater depth. URL: http://borg.cs.dal.ca/~nat/Papers/accuracy.ps.gz * "A Novelty Detection Approach to Classification", Japkowicz, N., Myers, C. & Gluck, M., in the proceedings of the Fourteenth International Joint Conference on Artificial Intelligence (IJCAI-95). pp. 518-523. 
Abstract: --------- Novelty Detection techniques are concept-learning methods that proceed by recognizing positive instances of a concept rather than differentiating between its positive and negative instances. Novelty Detection approaches consequently require very few, if any, negative training instances. This paper presents a particular Novelty Detection approach to classification that uses a Redundancy Compression and Non-Redundancy Differentiation technique based on the Gluck & Myers model of the hippocampus, a part of the brain critically involved in learning and memory. In particular, this approach consists of training an autoencoder to reconstruct positive input instances at the output layer and then using this autoencoder to recognize novel instances. Classification is possible, after training, because positive instances are expected to be reconstructed accurately while negative instances are not. The purpose of this paper is to compare HIPPO, the system that implements this technique, to C4.5 and feedforward neural network classification on several applications. URL: http://borg.cs.dal.ca/~nat/Papers/ijcai95_final.ps.gz -- Nathalie Japkowicz, Ph.D. 
Assistant Professor Faculty of Computer Science DalTech/Dalhousie University 6050 University Avenue Halifax, Nova Scotia Canada, B3H 1W5 e-mail: nat at cs.dal.ca Homepage: http://borg.cs.dal.ca/~nat From schubert at sto.foa.se Thu Sep 30 03:27:15 1999 From: schubert at sto.foa.se (Johan Schubert) Date: Thu, 30 Sep 1999 09:27:15 +0200 Subject: On web: Clustering Belief Functions (Dempster-Shafer Theory) Message-ID: <990930092719.ZM24839@atlas.sto.foa.se> Clustering Belief Functions (Dempster-Shafer Theory) ---------------------------------------------------- My papers on clustering belief functions, etc., are now available on the web with URL: http://www.foa.se/fusion/

Publications

Schubert, J., Simultaneous Dempster-Shafer clustering and gradual determination of number of clusters using a neural network structure. In Proceedings of the 1999 Information, Decision and Control Conference (IDC'99), Adelaide, Australia, 8-10 February 1999. IEEE, Piscataway, 1999, pp. 401-406.

Schubert, J., A neural network and iterative optimization hybrid for Dempster-Shafer clustering. In Proceedings of EuroFusion98 International Conference on Data Fusion (EF'98), M. Bedworth, J. O'Brien (Eds.), Great Malvern, UK, 6-7 October 1998, pp. 29-36.

Schubert, J., Fast Dempster-Shafer clustering using a neural network structure. In Proceedings of the Seventh International Conference on Information Processing and Management of Uncertainty in Knowledge-based Systems (IPMU'98), Université de La Sorbonne, Paris, France, 6-10 July 1998. Editions EDK, Paris, 1998, pp. 1438-1445.

Bergsten, U., Schubert, J. and Svensson, P., Applying Data Mining and Machine Learning Techniques to Submarine Intelligence Analysis. In Proceedings of the Third International Conference on Knowledge Discovery and Data Mining (KDD'97), D. Heckerman, H. Mannila, D. Pregibon, R. Uthurusamy (Eds.), Newport Beach, USA, 14-17 August 1997. The AAAI Press, Menlo Park, pp. 127-130. 
Schubert, J., Creating Prototypes for Fast Classification in Dempster-Shafer Clustering. In Qualitative and Quantitative Practical Reasoning, D. M. Gabbay, R. Kruse, A. Nonnengart, H. J. Ohlbach (Eds.), Proceedings of the First International Joint Conference on Qualitative and Quantitative Practical Reasoning (ECSQARU-FAPR'97), Bad Honnef, Germany, 9-12 June 1997. Springer-Verlag (LNAI 1244), Berlin, 1997, pp. 525-535.

Schubert, J., Specifying nonspecific evidence. International Journal of Intelligent Systems 11(8), 525-563, 1996.

Schubert, J., On Rho in a Decision-Theoretic Apparatus of Dempster-Shafer Theory. International Journal of Approximate Reasoning 13(3), 185-200, 1995. (FOA-B--95-00097-3.4--SE, Defence Research Establishment, 1995)

Schubert, J., Cluster-based Specification Techniques in Dempster-Shafer Theory for an Evidential Intelligence Analysis of Multiple Target Tracks (Thesis Abstract). AI Communications 8(2) (1995) 107-110.

Schubert, J., Cluster-based Specification Techniques in Dempster-Shafer Theory. In Symbolic and Quantitative Approaches to Reasoning and Uncertainty, C. Froidevaux and J. Kohlas (Eds.), Proceedings of the European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty (ECSQARU'95), Université de Fribourg, Switzerland, 3-5 July 1995. Springer-Verlag (LNAI 946), Berlin, 1995, pp. 395-404.

Schubert, J., Finding a Posterior Domain Probability Distribution by Specifying Nonspecific Evidence. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 3(2) (1995) 163-185.

Schubert, J., Cluster-based Specification Techniques in Dempster-Shafer Theory for an Evidential Intelligence Analysis of Multiple Target Tracks, Ph.D. Thesis, TRITA-NA-9410, ISRN KTH/NA/R--94/10--SE, ISSN 0348-2952, ISBN 91-7170-801-4. Royal Institute of Technology, Sweden, 1994.

Bergsten, U. and Schubert, J., Dempster's Rule for Evidence Ordered in a Complete Directed Acyclic Graph. 
International Journal of Approximate Reasoning 9(1) (1993) 37-73.

Schubert, J., On Nonspecific Evidence. International Journal of Intelligent Systems 8(6) (1993) 711-725.

All papers are available as postscript files, most are also available as pdf files [except for my 1994 Ph.D. thesis which is only available in hard copy by post upon request (no charge): schubert at sto.foa.se]. Sincerely, Johan Schubert Department of Data and Information Fusion Defence Research Establishment, Sweden E-mail: schubert at sto.foa.se From bogus@does.not.exist.com Thu Sep 30 09:10:53 1999 From: bogus@does.not.exist.com () Date: Thu, 30 Sep 1999 15:10:53 +0200 Subject: CFP: ESANN'2000 European Symposium on Artificial Neural Networks Message-ID:

----------------------------------------------------
|                                                  |
|                   ESANN'2000                     |
|                                                  |
|              8th European Symposium              |
|          on Artificial Neural Networks           |
|                                                  |
|      Bruges (Belgium) - April 26-27-28, 2000     |
|                                                  |
|      First announcement and call for papers      |
----------------------------------------------------

Technically co-sponsored by the IEEE Neural Networks Council, the IEEE Region 8, the IEEE Benelux Section, and the International Neural Networks Society. The call for papers for the ESANN'2000 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate it if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple mailings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. You will find below a short version of this call for papers, without the instructions to authors (available on the Web). If you have difficulty connecting to the Web, please send an e-mail to esann at dice.ucl.ac.be and we will send you a full version of the call for papers. 
ESANN'2000 is organised in collaboration with the UCL (Université catholique de Louvain, Louvain-la-Neuve) and the KULeuven (Katholieke Universiteit Leuven).

Scope and topics
----------------

Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become the reference for researchers on fundamentals and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN, in order to present their latest results and comprehensive surveys, and to discuss the future developments in this field. The ESANN'2000 conference will focus on fundamental aspects of ANNs: theory, models, learning algorithms, mathematical aspects, approximation of functions, classification, control, time-series prediction, statistics, signal processing, vision, self-organization, vector quantization, evolutive learning, psychological computations, biological plausibility, etc. Papers on links and comparisons between ANNs and other domains of research (such as statistics, data analysis, signal processing, biology, psychology, evolutive learning, bio-inspired systems, etc.) are also encouraged. Papers will be presented orally (no parallel sessions) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. Note that it is the topic of a paper, not its quality, that determines whether it better fits an oral or a poster session. The selection criteria for posters will be identical to those for oral presentations, and both will be printed in the same way in the proceedings. Nevertheless, authors may indicate on the author submission form that they are only willing to present their paper orally. 
The following is a non-exhaustive list of topics covered during the ESANN conferences:
o theory
o models and architectures
o mathematics
o learning algorithms
o vector quantization
o self-organization
o RBF networks
o Bayesian classification
o recurrent networks
o support vector machines
o time series forecasting
o adaptive control
o statistical data analysis
o independent component analysis
o signal processing
o approximation of functions
o cellular neural networks
o fuzzy neural networks
o natural and artificial vision
o hybrid networks
o identification of non-linear dynamic systems
o biologically plausible artificial networks
o bio-inspired systems
o neurobiological systems
o cognitive psychology
o adaptive behaviour
o evolutive learning

Special sessions
----------------

Special sessions will be organized by renowned scientists in their respective fields. Papers submitted to these sessions are reviewed according to the same rules as any other submission. Authors who submit papers to one of these sessions are invited to mention it on the author submission form; nevertheless, submissions to the special sessions must follow the same format, instructions and deadlines as any other submission, and must be sent to the same address.
o Self-organizing maps for data analysis - J. Lampinen, K. Kaski, Helsinki Univ. of Tech. (Finland)
o Time-series prediction - J. Suykens, J. Vandewalle, K.U. Leuven (Belgium)
o Artificial neural networks and robotics - R. Duro, J. Santos Reyes, Univ. da Coruna (Spain)
o Support Vector Machines - C. Campbell, Bristol Univ. (UK), J. Suykens, K.U. Leuven (Belgium)
o Neural networks and statistics - W. Duch, Nicholas Copernicus Univ. (Poland)
o Neural networks in medicine - T. Villmann, Univ. Leipzig (Germany)
o Artificial neural networks for energy management systems - G. Joya, Univ. de Malaga (Spain)

Location
--------

The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe. 
Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known world-wide for its architectural style, its canals, and its pleasant atmosphere. The conference will be organised in a hotel within walking distance of the town centre. There is no obligation for the participants to stay in this hotel. Hotels at all levels of comfort and price are available in Bruges; rooms may be booked in the conference hotel, or in another one (50 m from the first one) at a preferential rate, through the conference secretariat. A list of other smaller hotels is also available. The conference will be held at the Novotel hotel, Katelijnestraat 65B, 8000 Brugge, Belgium.

Call for contributions
----------------------

Prospective authors are invited to submit
- six original copies of their manuscript (including at least two originals or very good copies without glued material, which will be used for the proceedings)
- one signed copy of the author submission form
before December 10, 1999. Authors are invited to enclose a floppy disk or CD with their contribution in (generic) PostScript or PDF format. Sorry, electronic or fax submissions are not accepted. The working language of the conference (including proceedings) is English. The instructions to authors, together with the author submission form, are available on the ESANN Web server: http://www.dice.ucl.ac.be/esann A printed version of these documents is also available through the conference secretariat (please use email if possible). Authors are invited to follow the instructions to authors. A LaTeX style file is also available on the Web. Authors must indicate their choice for oral or poster presentation on the author submission form. They must also sign a written agreement that they will register for the conference and present the paper if their submission is accepted. 
Authors of accepted papers will have to register before February 28, 2000. They will benefit from the advance registration fee. Submissions must be sent to:

Michel Verleysen
UCL - DICE
3, place du Levant
B-1348 Louvain-la-Neuve
Belgium
esann at dice.ucl.ac.be

All submissions will be acknowledged by fax or email before December 23, 1999.

Deadlines
---------
Submission of papers           December 10, 1999
Notification of acceptance     January 31, 2000
Symposium                      April 26-27-28, 2000

Registration fees
-----------------
                registration before    registration after
                March 17, 2000         March 17, 2000
Universities    BEF 16000              BEF 17000
Industries      BEF 20000              BEF 21000

The registration fee includes attendance at all sessions, lunches during the three days of the conference, coffee breaks twice a day, the conference dinner, and the proceedings.

Conference secretariat
----------------------
Michel Verleysen                 phone: + 32 2 420 37 57
D facto conference services      fax: + 32 2 420 02 55
27 rue du Laekenveld             E-mail: esann at dice.ucl.ac.be
B - 1080 Brussels (Belgium)      http://www.dice.ucl.ac.be/esann

Steering and local committee
----------------------------
François Blayo          Préfigure (F)
Marie Cottrell          Univ. Paris I (F)
Jeanny Hérault          INPG Grenoble (F)
Henri Leich             Fac. Polytech. Mons (B)
Bernard Manderick       Vrije Univ. Brussel (B)
Eric Noldus             Univ. Gent (B)
Jean-Pierre Peters      FUNDP Namur (B)
Joos Vandewalle         KUL Leuven (B)
Michel Verleysen        UCL Louvain-la-Neuve (B)

Scientific committee (to be confirmed)
--------------------
Edoardo Amaldi          Politecnico di Milano (I)
Agnès Babloyantz        Univ. Libre Bruxelles (B)
Hervé Bourlard          IDIAP Martigny (CH)
Joan Cabestany          Univ. Polit. de Catalunya (E)
Holk Cruse              Universität Bielefeld (D)
Eric de Bodt            Univ. Lille II & UCL Louv.-la-N. (B)
Dante Del Corso         Politecnico di Torino (I)
Wlodek Duch             Nicholas Copernicus Univ. (PL)
Marc Duranton           Philips / LEP (F)
Jean-Claude Fort        Université Nancy I (F)
Bernd Fritzke           Dresden Univ. of Technology (D)
Stan Gielen             Univ. of Nijmegen (NL)
Manuel Grana            UPV San Sebastian (E)
Anne Guérin-Dugué       INPG Grenoble (F)
Martin Hasler           EPFL Lausanne (CH)
Laurent Hérault         CEA-LETI Grenoble (F)
Christian Jutten        INPG Grenoble (F)
Juha Karhunen           Helsinki Univ. of Technology (FIN)
Vera Kurkova            Acad. of Science of the Czech Rep. (CZ)
Petr Lansky             Acad. of Science of the Czech Rep. (CZ)
Mia Loccufier           Univ. Gent (B)
Eddy Mayoraz            Motorola Palo Alto (USA)
Jean-Arcady Meyer       Univ. Pierre et Marie Curie - Paris 6 (F)
José Mira               UNED (E)
Jean-Pierre Nadal       Ecole Normale Supérieure Paris (F)
Gilles Pagès            Univ. Pierre et Marie Curie - Paris 6 (F)
Thomas Parisini         Politecnico di Milano (I)
Hélène Paugam-Moisy     Univ. Lumière Lyon 2 (F)
Alberto Prieto          Universidad de Granada (E)
Leonardo Reyneri        Politecnico di Torino (I)
Tamas Roska             Hungarian Academy of Science (H)
Jean-Pierre Rospars     INRA Versailles (F)
John Stonham            Brunel University (UK)
Johan Suykens           KUL Leuven (B)
John Taylor             King's College London (UK)
Claude Touzet           IUSPIM Marseilles (F)
Marc Van Hulle          KUL Leuven (B)
Christian Wellekens     Eurecom Sophia-Antipolis (F)

From mw at stat.Duke.EDU Thu Sep 30 10:40:35 1999 From: mw at stat.Duke.EDU (Mike West) Date: Thu, 30 Sep 1999 10:40:35 -0400 Subject: Faculty Positions Available At Duke University Message-ID: <19990930104035.B3972@isds.duke.edu> Dear Colleague I would appreciate your assistance in bringing the vacancies below to the attention of potential candidates, and in forwarding the ad to your departmental colleagues. Thanks. Mike W =========================================================== Mike West Arts & Sciences Professor of Statistics & Decision Sciences Director, Institute of Statistics & Decision Sciences Duke University, Durham, NC 27708-0251. 
USA tel/fax: (919) 684-8842/8594 http://www.stat.duke.edu =========================================================== ********************************************** STATISTICS AND BIOSTATISTICS FACULTY POSITIONS Institute of Statistics & Decision Sciences DUKE UNIVERSITY *************** Duke University has openings for tenured and tenure-track faculty, to begin in Fall 2000. We invite applications and nominations for the positions detailed below. (a) Full professor in the Institute of Statistics and Decision Sciences (ISDS), and (b) Assistant professor in Biostatistics in the School of Medicine, with a joint appointment in ISDS. Suitable applicants for appointment as tenured Professor of Statistics and Decision Sciences will be recognised research leaders in statistics. We are particularly interested in hearing from potential applicants in Bayesian statistics and related areas, and with disciplinary interests in biomedical applications. In collaboration with other departments at Duke, ISDS is developing a range of activities in statistical genetics and bioinformatics more broadly, and so particularly encourages applicants whose applied interests relate to these areas. Applications and nominations should be sent to Mike West, Director, ISDS, Duke University, Durham NC 27708-0251. Applications received by January 15th 2000 will be guaranteed full consideration. Appointment at the assistant professor level will be tenure track in the Division of Biometry in the School of Medicine, with a secondary appointment in ISDS. The appointee will work on cancer-related research projects and cancer clinical trials in the Biostatistics Unit of the Duke Comprehensive Cancer Center, and will have teaching and research roles in both ISDS and Biometry. A suitable applicant will hold a PhD in statistics or biostatistics, and have evident potential for excellence in research in biomedical statistics and quality teaching. 
Some background in areas involving collaborative medical research, clinical trials and interactions with medical research investigators will be beneficial. Applicants should mail a CV and letter of application, and arrange for three letters of reference to be sent to the Faculty Search Committee, Box 3958, Duke University Medical Center, Durham, NC 27710. Applications received by January 15th 2000 will be guaranteed full consideration. Additional appointments in biostatistics, including non-tenure/research track positions, may be available. Applications from suitably qualified women and minority candidates, for each of the above positions, are particularly encouraged. Duke University is an Equal Opportunity/Affirmative Action Employer. ****************************************************** Further information is available at the ISDS web site: http://www.stat.duke.edu ****************************************************** From radu_d at fred.EECS.Berkeley.EDU Thu Sep 30 13:45:54 1999 From: radu_d at fred.EECS.Berkeley.EDU (Radu Dogaru) Date: Thu, 30 Sep 1999 10:45:54 -0700 (PDT) Subject: Paper on a compact, simple and efficient neural architecture In-Reply-To: Message-ID: Dear Connectionists, The following paper is available and can be downloaded from http://trixie.eecs.berkeley.edu/~radu_d/publications.html#p or http://trixie.eecs.berkeley.edu/~radu_d/dogaru_ijcnn99.pdf All comments welcome. Perceptrons Revisited: The Addition of a Non-monotone Recursion Greatly Enhances their Representation and Classification Properties Radu Dogaru, Marinel Alangiu, Matthias Rychetsky and Manfred Glesner Abstract In this paper we describe a novel neural architecture and compare its representation and classification performances with classic solutions. It combines linear units with a compact and simple-to-implement non-linear transform defined as a finite recursion of simple non-monotonic functions.
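[An illustrative aside, not the recursion actually proposed in the paper: the extra power of a non-monotone output function is easy to see on the parity problem. Parity of n binary inputs depends only on the input sum, so a single linear unit with unit weights followed by one periodic, non-monotone function computes parity exactly, a function that no monotone linear threshold gate can realize.]

```python
from itertools import product
from functools import reduce
from operator import xor

def parity_unit(x):
    """One linear unit (all weights 1) whose output passes through the
    non-monotone function g(s) = s mod 2 instead of a monotone threshold."""
    s = sum(x)      # linear part: dot(w, x) with w = (1, ..., 1)
    return s % 2    # non-monotone output function

# Exhaustive check against ground-truth parity for all 8-bit inputs.
assert all(parity_unit(x) == reduce(xor, x) for x in product((0, 1), repeat=8))
```

[This toy uses 8 synapses and one non-monotone function; the abstract's figure of 3 nonlinear units refers to the paper's own recursion, which is not reproduced here.]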
When such a nonlinear recursion replaces the standard output function of a perceptron-like structure, the capability to represent Boolean functions is enhanced beyond that of standard linear threshold gates, and arbitrary Boolean functions can be learned. For example, realizing the parity function with 8 inputs requires only 8 synapses and 3 nonlinear units. While the use of nonlinear recursion at the output accounts for compact learning and memorization of arbitrary functions, it was found that good generalization capabilities are obtained when the nonlinear recursion is placed at the inputs. It is thus concluded that the proper addition of a simple nonlinear structure to the well-known linear perceptron removes most of its drawbacks, the resulting architecture being compact, easy to implement, and functionally equivalent to more sophisticated neural systems. --------------------------------------------------------- Dr. Radu Dogaru c/o Prof. Leon O. Chua University of California at Berkeley Department of Electrical Engineering and Computer Science Cory Hall #1770 Berkeley, CA 94720 - 1770 Tel: (510) 643-8868 Fax: (510) 643-8869 E-mail: radu_d at fred.EECS.Berkeley.EDU http://trixie.eecs.berkeley.edu/~radu_d _________________________________________________________ From fayyad at MICROSOFT.com Thu Sep 30 14:21:07 1999 From: fayyad at MICROSOFT.com (Usama Fayyad) Date: Thu, 30 Sep 1999 11:21:07 -0700 Subject: SIGKDD Explorations: call for paper for vol. 1 issue 2 Message-ID: This is to announce that the second issue of SIGKDD Explorations, the official newsletter of the ACM's new Special Interest Group (SIG) on Knowledge Discovery and Data Mining, will be published by the end of the year. The first issue is available online at http://research.microsoft.com/datamine/SIGKDD. The SIGKDD Explorations newsletter is sent to the ACM SIGKDD membership and to a world-wide network of libraries.
The ACM SIGKDD is a new special interest group and has grown to over 1000 members in its first 6 months. We invite submissions to the second issue, which will be published by year end. We are particularly interested in short research and survey articles on various aspects of data mining and KDD. Submissions can be made in any one of the following categories:
- survey/tutorial articles (short) on important topics, not exceeding 20 pages
- topical articles on problems and challenges
- well-articulated position papers
- technical articles, not exceeding 15 pages
- news items on the order of 1-3 paragraphs
- brief announcements, not exceeding 5 lines in length
- review articles of products and methodologies, not exceeding 20 pages
- reviews/summaries from conferences, panels and special meetings
- reports on relevant meetings and committees related to the field
Submissions should be made to fayyad at acm.org or sunita at cs.berkeley.edu. All submissions must arrive by October 20, 1999 for inclusion in the next issue. Please provide a URL if there is associated web information. Some words about the SIGKDD newsletter: -------------------------------------- SIGKDD Explorations is a bi-annual newsletter dedicated to serving the SIGKDD membership and community. Our goal is to make SIGKDD Explorations an informative, rapidly published, and interesting forum for communicating with the SIGKDD community. Submissions will be reviewed by the editor and/or associate editors as appropriate. The distribution will be very wide: on the web (probably without access restrictions in the first year), to all members, and to ACM's world-wide network of libraries. Members receive e-mail notification of new issues and hardcopies if they desire. For more information on SIGKDD visit http://www.acm.org/sigkdd and for more information on the newsletter visit http://research.microsoft.com/datamine/SIGKDD.
Usama Fayyad, Editor-in-Chief fayyad at acm.org Sunita Sarawagi, Associate Editor sunita at cs.berkeley.edu From oby at cs.tu-berlin.de Thu Sep 2 10:33:01 1999 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Thu, 2 Sep 1999 16:33:01 +0200 (MET DST) Subject: No subject Message-ID: <199909021433.QAA08015@pollux.cs.tu-berlin.de> Dear Connectionists, attached please find abstracts and preprint-locations of four papers about: 1. the application of Gold et al.'s (1995) matching method to the measurement of flow fields in fluid dynamics 2. Bayesian transduction 3. ICA and optical recording of brain activity 4.
the role of cortical competition in visual cortical information processing Cheers Klaus =============================================================================== A new particle tracking algorithm based on deterministic annealing and alternative distance measures M. Stellmacher and K. Obermayer CS Department, Technical University of Berlin, Germany We describe a new particle tracking algorithm for the interrogation of double frame single exposure data which is obtained with particle image velocimetry. The new procedure is based on an algorithm which has recently been proposed by Gold et al. (1995) for solving point matching problems in statistical pattern recognition. For a given interrogation window, the algorithm simultaneously extracts: (1) the correct correspondences between particles in both frames and (2) an estimate of the local flow-field parameters. Contrary to previous methods, the algorithm determines not only the local velocity, but other local components of the flow field, for example rotation and shear. This makes the new interrogation method superior to standard methods, in particular in regions with high velocity gradients (e.g. vortices or shear flows). We perform benchmarks with three standard particle image velocimetry (PIV) and particle tracking velocimetry (PTV) methods: cross-correlation, nearest neighbour search, and image relaxation. We show that the new algorithm requires fewer particles per interrogation window than cross-correlation and allows for much higher particle densities than the other PTV methods. Consequently, one may obtain the velocity field at high spatial resolution even in regions of very fast flows. Finally, we find that the new algorithm is more robust against out-of-plane noise than previously proposed methods. http://ni.cs.tu-berlin.de/publications/#journals to appear in: Experiments in Fluids ------------------------------------------------------------------------------- Bayesian Transduction T. Graepel, R.
Herbrich, and K. Obermayer CS Department, Technical University of Berlin, Germany Transduction is an inference principle that takes a training sample and aims at estimating the values of a function at given points contained in the so-called working sample. Hence, transduction is a less ambitious task than induction, which aims at inferring a functional dependency on the whole of input space. As a consequence, however, transduction provides a confidence measure on single predictions rather than classifiers, a feature particularly important for risk-sensitive applications. We consider the case of binary classification by linear discriminant functions (perceptrons) in kernel space. From the transductive point of view, the infinite number of perceptrons is boiled down to a finite number of equivalence classes on the working sample, each of which corresponds to a polyhedron in parameter space. In the Bayesian spirit, the a posteriori probability of a labelling of the working sample is determined as the ratio between the volume of the corresponding polyhedron and the volume of version space. The maximum a posteriori scheme then recommends choosing the labelling of maximum volume. We suggest sampling version space by an ergodic billiard in kernel space. Experimental results on real world data indicate that Bayesian Transduction compares favourably to the well-known Support Vector Machine, in particular if the a posteriori probability of labellings is used as a confidence measure to exclude test points of low confidence. http://ni.cs.tu-berlin.de/publications/#conference to be presented at NIPS 1999 ------------------------------------------------------------------------------- Application of blind separation of sources to optical recording of brain activity H. Schöner^1, M. Stetter^1, I. Schießl^1, J. Mayhew^2, J. Lund^3, N. McLoughlin^3, and K.
Obermayer^1 1: CS Department, Technical University of Berlin, Germany 2: AIVRU, University of Sheffield, UK 3: Institute of Ophthalmology, UCL, UK In the analysis of data recorded by optical imaging from intrinsic signals (measurement of changes of light reflectance from cortical tissue) the removal of noise and artifacts such as blood vessel patterns is a serious problem. Often bandpass filtering is used, but the underlying assumption that a spatial frequency exists which separates the mapping component from other components (especially the global signal) is questionable. Here we propose alternative ways of processing optical imaging data, using blind source separation techniques based on the spatial decorrelation of the data. We first perform benchmarks on artificial data in order to select the way of processing that is most robust with respect to sensor noise. We then apply it to recordings of optical imaging experiments from macaque primary visual cortex. We show that our BSS technique is able to extract ocular dominance and orientation preference maps from single condition stacks for data where standard post-processing procedures fail. Artifacts, especially blood vessel patterns, can often be completely removed from the maps. In summary, our method for blind source separation using extended spatial decorrelation is a superior technique for the analysis of optical recording data. http://ni.cs.tu-berlin.de/publications/#conference to be presented at NIPS 1999 ------------------------------------------------------------------------------- Recurrent cortical competition: Strengthen or weaken? P. Adorján, L. Schwabe, C. Piepenbrock, and K. Obermayer CS Department, Technical University of Berlin, Germany We investigate the short term dynamics of recurrent competition and neural activity in the primary visual cortex in terms of information processing and in the context of orientation selectivity.
We propose that after stimulus onset, the strength of the recurrent excitation decreases due to fast synaptic depression. As a consequence, the network is shifted from an initially highly nonlinear to a more linear operating regime. Sharp orientation tuning is established in the first, highly competitive phase. In the second and less competitive phase, precise signaling of multiple orientations and long range modulation, e.g. by intra- and inter-areal connections, becomes possible (surround effects). Thus the network first extracts the salient features from the stimulus, and then starts to process the details. We show that this signal processing strategy is optimal if the neurons have limited bandwidth and their objective is to transmit the maximum amount of information in any time interval beginning with the stimulus onset. http://ni.cs.tu-berlin.de/publications/#conference to be presented at NIPS 1999 ================================================================================ Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ From Annette_Burton at Brown.edu Thu Sep 2 16:00:47 1999 From: Annette_Burton at Brown.edu (Annette Burton) Date: Thu, 2 Sep 1999 16:00:47 -0400 Subject: No subject Message-ID: A non-text attachment was scrubbed...
From wiskott at itb.biologie.hu-berlin.de Tue Sep 7 05:42:57 1999 From: wiskott at itb.biologie.hu-berlin.de (Laurenz Wiskott) Date: Tue, 7 Sep 1999 11:42:57 +0200 Subject: Bibliographies Message-ID: <199909070942.LAA00972@monod.biologie.hu-berlin.de> Dear all, I have compiled several bibliographies on computational models and algorithms related to vision and neural networks, some of which might be of interest to you. They also contain many links to online documents and authors' homepages. You can access the bibliographies via my homepage http://itb.biologie.hu-berlin.de/~wiskott/homepage.html (or http://www.cnl.salk.edu/~wiskott/homepage.html). Any kind of (constructive) feedback is welcome. Best regards, Laurenz Wiskott. BIBLIOGRAPHIES The approximate number of references and the support level are given in brackets.
Invariances in Neural Systems (175, mixed)
Learning Invariances (51, medium - 1998)
Cortical and Artificial Neural Maps (135, mixed)
Receptive Field Development (4, low)
Cortical Map Analysis (8, low)
Cortical Map Formation (106, high - 1999)
Artificial Neural Maps (15, low)
Face Processing (196, mixed)
Facial Feature Finding (30, low)
Face Coding and Animation (59, low)
Face Analysis (34, low)
Face Recognition (95, medium - 1999)
Dynamic Link Matching (13, medium - 1999)
Visual Motion Processing (615, mixed)
Depth from Stereo (71, low)
Optical Flow Estimation (248, medium - 1998)
Image Motion Analysis (262, low)
Segmentation from Motion (106, medium - 1998)
Visual Tracking (48, low)
Video Coding (71, low)
-- Laurenz Wiskott, Innovationskolleg Theoretische Biologie, Berlin http://itb.biologie.hu-berlin.de/~wiskott/ wiskott at itb.biologie.hu-berlin.de From fritz at neuro.informatik.uni-ulm.de Wed Sep 8 11:46:31 1999 From: fritz at neuro.informatik.uni-ulm.de (Fritz Sommer) Date: Wed, 8 Sep 99 17:46:31 +0200 Subject: Job opening in fMRI analysis/modeling Message-ID: <9909081546.AA13802@neuro.informatik.uni-ulm.de> Research Position (cognitive/computational neuroscience) A position (beginning Nov 1999, 2 years, 1 year extension possible) is available at the University of Ulm in an interdisciplinary research project on analysis and modeling of functional magnetic resonance data. In this joint project of the departments of Psychiatry, Radiology and Neural Information Processing, a method for the detection and interpretation of functional/effective connectivity in fMRI data will be developed and applied to working memory tasks. Candidates should have a background in statistical methods, functional MRI analysis or computational neuroscience. Required is a recent masters degree or equivalent in computer science, physics, mathematics or in a closely related area. The research can be conducted as part of a PhD thesis in Computer Science.
Experience in programming in C in a Unix environment is necessary; experience with MATLAB and SPM is helpful. Salary according to BAT IIa. The University of Ulm is an equal opportunity employer and emphatically encourages female scientists to apply. Employment will be effected through the "Zentrale Universitaetsverwaltung" of the University of Ulm. Please send CV, letter of motivation and addresses of three referees to: Prof. Dr. Dr. M. Spitzer, Department of Psychiatry III, University of Ulm, Leimgrubenweg 12, 89075 Ulm, Germany or e-mail to manfred.spitzer at medizin.uni-ulm.de. For more detailed information on the research project please contact Dr. F. T. Sommer, email: fritz at neuro.informatik.uni-ulm.de From cindy at cns.bu.edu Wed Sep 8 10:10:45 1999 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 8 Sep 1999 10:10:45 -0400 Subject: Neural Networks 12(6) Message-ID: <199909081410.KAA07821@retina.bu.edu> NEURAL NETWORKS 12(6) Contents - Volume 12, Number 6 - 1999
NEURAL NETWORKS LETTERS:
Improving support vector machine classifiers by modifying kernel functions - S. Amari and S. Wu
ARTICLES:
*** Neuroscience and Neuropsychology ***
Self-organization of shift-invariant receptive fields - Kunihiko Fukushima
Faithful representations with topographic maps - M.M. van Hulle
A learning algorithm for oscillatory cellular neural networks - C.Y. Ho and H. Kurokawa
*** Mathematical and Computational Analysis ***
Properties of learning of a Fuzzy ART variant - M. Georgiopoulos, I. Dagher, G.L. Heileman, and G. Bebis
Morphological bidirectional associative memories - G.X. Ritter, J.L. Diaz-de-Leon, and P. Sussner
The basins of attraction of a new Hopfield learning rule - A.J. Storkey and R. Valabregue
*** Technology and Applications ***
Bayesian neural networks for classification: How useful is the evidence framework? - W.D. Penny and S.J. Roberts
Conformal self-organization for continuity on a feature map - C.Y. Liou and W.-P. Tai
Design of trellis coded vector quantizers using Kohonen maps - Chi-Sing Leung and Lai-Wan Chan
An information theoretic approach for combining neural network process models - D.V. Sridhar, E.B. Bartlett, and R.C. Seagrave
Inferential estimation of polymer quality using bootstrap aggregated neural networks - J. Zhang
______________________________ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment.
----------------------------------------------------------------------------
Membership Type        INNS             ENNS                JNNS
----------------------------------------------------------------------------
membership with        $80              600 SEK             Y 15,000 [including
Neural Networks                                             2,000 entrance fee]
                       $55 (student)    460 SEK (student)   Y 13,000 (student)
                                                            [including 2,000
                                                            entrance fee]
-----------------------------------------------------------------------------
membership without     $30              200 SEK             not available to
Neural Networks                                             non-students
                                                            (subscribe through
                                                            another society)
                                                            Y 5,000 (student)
                                                            [including 2,000
                                                            entrance fee]
-----------------------------------------------------------------------------
Institutional rates    $1132            2230 NLG            Y 149,524
-----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
         OR
         [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408
531 28 Skovde
Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.ida.his.se/ida/enns

JNNS Membership
c/o Professor Tsukada
Faculty of Engineering
Tamagawa University
6-1-1, Tamagawa Gakuen, Machida-city
Tokyo 113-8656 Japan
81 42 739 8431 (phone)
81 42 739 8858 (fax)
jnns at jnns.inf.eng.tamagawa.ac.jp
http://jnns.inf.eng.tamagawa.ac.jp/home-j.html
************************* From hadley at cs.sfu.ca Wed Sep 8 19:23:56 1999 From: hadley at cs.sfu.ca (Bob Hadley) Date: Wed, 8 Sep 1999 16:23:56 -0700 (PDT) Subject: Computational Power and Limits of ANNs Message-ID: <199909082323.QAA00295@css.css.sfu.ca> URL: www.cs.sfu.ca/~hadley/online.html ~~~~~~~~~ Paper Available ~~~~~~~~ COGNITION AND THE COMPUTATIONAL POWER OF CONNECTIONIST NETWORKS by Robert F. Hadley School of Computing Science and Cognitive Science Program Simon Fraser University Burnaby, B.C., V5A 1S6 Canada hadley at cs.sfu.ca ABSTRACT This paper examines certain claims of ``cognitive significance'' which (wisely or not) have been based upon the theoretical powers of three distinct classes of connectionist networks, namely, the ``universal function approximators'', recurrent finite-state simulation networks, and Turing equivalent networks. Each class will be considered with respect to its potential in the realm of cognitive modeling. Regarding the first class, I argue that, contrary to the claims of some influential connectionists, feed-forward networks do NOT possess the theoretical capacity to approximate all functions of interest to cognitive scientists. For example, they cannot approximate many important, recursive (halting) functions which map symbolic strings onto other symbolic strings. By contrast, I argue that a certain class of recurrent networks (i.e., those which closely approximate deterministic finite automata, DFA) shows considerably greater promise in some domains.
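[An illustrative aside, not a construction from Hadley's paper: a recurrent network of threshold units can simulate a DFA exactly by hand-wiring the transition function. Below, the two-state parity automaton over {0,1} is realized by feeding the state bit back through the standard two-hidden-unit threshold construction for q <- q XOR x (XOR is not computable by a single threshold unit).]

```python
import numpy as np

def step(z):
    """Heaviside threshold nonlinearity."""
    return (np.asarray(z) > 0).astype(int)

# Hidden layer computes h1 = (q OR x) and h2 = (q AND x);
# the new state is q' = h1 AND NOT h2, i.e. q XOR x.
W_h = np.array([[1.0, 1.0],    # h1 fires iff q + x > 0.5
                [1.0, 1.0]])   # h2 fires iff q + x > 1.5
b_h = np.array([-0.5, -1.5])
w_o = np.array([1.0, -2.0])    # q' fires iff h1 - 2*h2 > 0.5
b_o = -0.5

def run_dfa_rnn(bits):
    """Feed a binary string through the network; return the final DFA state."""
    q = 0                                   # start state: even parity so far
    for x in bits:
        h = step(W_h @ np.array([q, x]) + b_h)
        q = int(step(w_o @ h + b_o))
    return q                                # 1 iff the string has odd parity

assert run_dfa_rnn([1, 0, 1, 1]) == 1
assert run_dfa_rnn([1, 1, 0, 0]) == 0
```

[The weights are hand-crafted, which is exactly the point the abstract goes on to make: the open question is how such weight vectors could be acquired by learning.]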
However, from a cognitive standpoint, difficulties arise when we consider how the relevant recurrent networks could acquire the weight vectors needed to support DFA simulations. These difficulties are severe in the realm of central high-level cognitive functions. In addition, the class of Turing equivalent networks is here examined. It is argued that the relevance of such networks to cognitive modeling is seriously undermined by their reliance on infinite precision in crucial weights and/or node activations. I also examine what advantages these networks might conceivably possess over and above classical symbolic algorithms. For, from a cognitive standpoint, the Turing equivalent networks present difficulties very similar to certain classical algorithms; they appear highly contrived, their structure is fragile, and they exhibit little or no noise-tolerance. (21 Pages -- 1.5 spacing ) Available by web at: www.cs.sfu.ca/~hadley/online.html From barba at cvs.rochester.edu Thu Sep 9 13:42:19 1999 From: barba at cvs.rochester.edu (Barbara Arnold) Date: Thu, 9 Sep 1999 13:42:19 -0400 Subject: Faculty Position Open Message-ID: Assistant Professor in Visual Neuroscience. The University of Rochester has available a tenure-track position for a neuroscientist working in the visual system. The successful candidate will have a primary appointment in the Department of Brain and Cognitive Sciences (http://www.bcs.rochester.edu ) and will be a member of the Center for Visual Science (http://www.cvs.rochester.edu), a strong, university-wide community of 27 faculty engaged in vision research. Applicants should submit a curriculum vitae, a brief statement of research and teaching interests, reprints and three reference letters to: David R. Williams, Director Center for Visual Science, University of Rochester, Rochester, NY 14627-0270. Application review begins December 1, 1999. 
The University of Rochester is an equal opportunity employer ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Barbara N. Arnold Administrator email: barba at cvs.rochester.edu Center for Visual Science phone: 716 275 8659 University of Rochester fax: 716 271 3043 Meliora Hall 274 Rochester NY 14627-0270 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From mgeorg at SGraphicsWS1.mpe.ntu.edu.sg Wed Sep 15 16:00:52 1999 From: mgeorg at SGraphicsWS1.mpe.ntu.edu.sg (Georg Thimm) Date: Fri, 10 Sep 1999 11:12:52 -12848 Subject: Events on Artificial Intelligence (moved to a new location!) Message-ID: <199909100312.LAA09802@SGraphicsWS1.mpe.ntu.edu.sg> ----------------------------------------- WWW page for Announcements of Conferences, Workshops and Other Events on Artificial Intelligence ----------------------------------------- This WWW page allows you to look up and enter announcements for conferences, workshops, and other events concerned with neural networks, inductive learning, genetic algorithms, data mining, agents, applications of AI, pattern recognition, vision, and related fields. ------------------------------------------------------------------------- Search and lookup can be restricted to events with forthcoming deadlines! Digests for events entered in the last 2, 5, 10 or 30 days are available! ------------------------------------------------------------------------- The frequently updated events list currently contains more than 130 forthcoming events and can be accessed via the URL: http://www.drc.ntu.edu.sg/users/mgeorg/enter.epl The entries are ordered chronologically and presented in a format for fast and easy lookup of: - date and place of the events, - titles of the events, - contact addresses (surface mail, email, ftp, and WWW address, as well as telephone or fax number), and - deadlines for submissions, registration, etc.
Conference organizers are kindly asked to enter their conference into the database: http://www.drc.ntu.edu.sg/users/mgeorg/NN-events.epl . The list is published in part in the journal Neurocomputing by Elsevier Science B.V. Information on past conferences is also available. Kind Regards, Georg Thimm P.S. You are welcome to distribute this announcement to related mailing lists. From hiro at ladyday.kyoto-su.ac.jp Fri Sep 10 02:18:47 1999 From: hiro at ladyday.kyoto-su.ac.jp (hiro) Date: Fri, 10 Sep 1999 15:18:47 +0900 (JST) Subject: JPSTH paper Message-ID: <199909100618.PAA00576@ladyday.kyoto-su.ac.jp> The following paper has been accepted for publication in Neural Computation and is available from http://www.kyoto-su.ac.jp/~hiro/jpsth_rev3.pdf ----------- Model Dependence in Quantification of Spike Interdependence by Joint Peri-Stimulus Time Histogram Hiroyuki Ito and Satoshi Tsuji Department of Information and Communication Sciences, Faculty of Engineering, Kyoto Sangyo University, Kita-ku, Kyoto 603-8555, Japan and CREST, Japan Science and Technology. ABSTRACT Multineuronal recordings have enabled us to examine context-dependent changes in the relationship between the activities of multiple cells. The Joint Peri-Stimulus Time Histogram (JPSTH) is a much-used method for investigating the dynamics of the interdependence of spike events between pairs of cells. Its results are often taken as an estimate of interaction strength between cells, independent of modulations in the cells' firing rates. We evaluate the adequacy of this estimate by examining the mathematical structure of how the JPSTH quantifies an interaction strength after excluding the contribution of firing rates. We introduce a simple probabilistic model of interacting point processes to generate simulated spike data, and show that the normalized JPSTH incorrectly infers the temporal structure of variations in the interaction parameter strength.
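[For readers unfamiliar with the quantity under discussion, a minimal sketch may help. This is our illustration of the normalization in the spirit of Aertsen et al.'s "effective connectivity", not the authors' code: the trial-averaged joint histogram is corrected by subtracting the product of the two PSTHs and dividing by the product of the bin-wise standard deviations.]

```python
import numpy as np

def normalized_jpsth(trials1, trials2):
    """Normalized JPSTH from two (n_trials, n_bins) binary spike arrays."""
    t1 = np.asarray(trials1, float)
    t2 = np.asarray(trials2, float)
    joint = np.einsum('tu,tv->uv', t1, t2) / t1.shape[0]  # raw JPSTH matrix
    p1, p2 = t1.mean(0), t2.mean(0)                       # the two PSTHs
    s1, s2 = t1.std(0), t2.std(0)                         # bin-wise std devs
    denom = np.outer(s1, s2)
    with np.errstate(invalid='ignore', divide='ignore'):
        return np.where(denom > 0, (joint - np.outer(p1, p2)) / denom, 0.0)
```

[On this normalization, feeding a cell's spike trains against themselves yields 1 on the diagonal, which is why it is often read, as the abstract notes critically, as a rate-independent interaction strength.]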
The incorrect inference occurs because, in our model, the correct normalization of firing rate contributions is different to that used in Aertsen et al.'s ``effective connectivity'' model. This demonstrates that firing rate modulations cannot be corrected for in a model-independent manner, and therefore the effective connectivity does not represent a universal characteristic that is independent of modulation of the firing rates. Aertsen et al.'s effective connectivity may still be used in the analysis of experimental data, provided we are aware that this is simply one of many ways of describing the structure of interdependence. We also discuss some measure-independent characteristics of the structure of interdependence. ------------ Regards. Hiroyuki Ito Dept. of Information & Communication Sci. Faculty of Engineering Kyoto Sangyo University JAPAN e-mail: hiro at ics.kyoto-su.ac.jp From morten at compute.it.siu.edu Fri Sep 10 14:18:36 1999 From: morten at compute.it.siu.edu (Dr. Morten H. Christiansen) Date: Fri, 10 Sep 1999 13:18:36 -0500 (CDT) Subject: Paper announcements Message-ID: The following two papers may be of interest to the readers of this list. Both papers involve connectionist modeling of psycholinguistic data. Christiansen, M.H. & Chater, N. (1999). Toward a connectionist model of recursion in human linguistic performance. Cognitive Science, 23, 157-205. Abstract Naturally occurring speech contains only a limited amount of complex recursive structure, and this is reflected in the empirically documented difficulties that people experience when processing such structures. We present a connectionist model of human performance in processing recursive language structures. The model is trained on simple artificial languages.
We find that the qualitative performance profile of the model matches human behavior, both in the relative difficulty of center-embedded and cross-dependency structures, and in the contrast between the processing of these complex recursive structures and right-branching recursive constructions. We analyze how these differences in performance are reflected in the internal representations of the model by performing discriminant analyses on these representations both before and after training. Furthermore, we show how a network trained to process recursive structures can also generate such structures in a probabilistic fashion. This work suggests a novel explanation of people's limited recursive performance, without assuming the existence of a mentally represented competence grammar allowing unbounded recursion. The paper was published in the current issue of Cognitive Science. A preprint version can be downloaded from: http://www-rcf.usc.edu/~mortenc/nn-rec.html ---------------------------------------------------------------------- Christiansen, M.H. & Curtin, S.L. (1999). The power of statistical learning: No need for algebraic rules. In The Proceedings of the 21st Annual Conference of the Cognitive Science Society (pp. 114-119). Mahwah, NJ: Lawrence Erlbaum Associates. Abstract Traditionally, it has been assumed that rules are necessary to explain language acquisition. Recently, Marcus, Vijayan, Rao & Vishton (1999) have provided behavioral evidence which they claim can only be explained by invoking algebraic rules. In the first part of this paper, we show that, contrary to these claims, an existing simple recurrent network model of word segmentation can fit the relevant data without invoking any rules. Importantly, the model closely replicates the experimental conditions, and no changes are made to the model to accommodate the data.
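[Editor's note: the ``simple recurrent network'' referred to here is Elman's architecture, in which a context layer feeds the previous hidden state back alongside the current input. A minimal, untrained forward-pass sketch follows; the class name and dimensions are illustrative assumptions, not the authors' actual model.]

```python
import numpy as np

rng = np.random.default_rng(1)

class SimpleRecurrentNetwork:
    """Minimal Elman-style SRN forward pass (untrained; illustrative only)."""

    def __init__(self, n_in, n_hidden):
        self.W_xh = rng.normal(0.0, 0.1, (n_hidden, n_in))   # input -> hidden
        self.W_hh = rng.normal(0.0, 0.1, (n_hidden, n_hidden))  # context -> hidden
        self.W_hy = rng.normal(0.0, 0.1, (n_in, n_hidden))   # hidden -> output
        self.h = np.zeros(n_hidden)  # context layer: copy of previous hidden state

    def step(self, x):
        # new hidden state depends on current input and the previous hidden state
        self.h = np.tanh(self.W_xh @ x + self.W_hh @ self.h)
        y = self.W_hy @ self.h
        e = np.exp(y - y.max())
        return e / e.sum()           # softmax distribution over the next symbol

net = SimpleRecurrentNetwork(n_in=4, n_hidden=8)
for t in range(3):
    x = np.eye(4)[t % 4]             # one-hot encoding of the current symbol
    p = net.step(x)                  # predicted distribution over the next symbol
```

Training such a network on symbol sequences (e.g. by backpropagation through the prediction error) is what yields the statistical-learning behavior discussed in the paper.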
The second part provides a corpus analysis inspired by this model, demonstrating that lexical stress changes the basic representational landscape over which statistical learning takes place. This change makes the task of word segmentation easier for statistical learning models, and further obviates the need for lexical stress rules to explain the bias towards trochaic stress patterns in English. Together the connectionist simulations and the corpus analysis show that statistical learning devices are sufficiently powerful to eliminate the need for rules in an important part of language acquisition. The paper was published in the most recent Cognitive Science Society proceedings. An HTML version of the paper can be viewed at: http://www.siu.edu/~psycho/faculty/morten/statlearn.html And a hardcopy can be downloaded from: http://www-rcf.usc.edu/~mortenc/no-rules.html Best regards, Morten Christiansen PS: Apologies if you receive two copies of this message. ---------------------------------------------------------------------- Morten H. 
Christiansen Assistant Professor Phone: +1 (618) 453-3547 Department of Psychology Fax: +1 (618) 453-3563 Southern Illinois University Email: morten at siu.edu Carbondale, IL 62901-6502 Office: Life Sciences II, Room 271A URL: http://www.siu.edu/~psycho/faculty/mhc.html ---------------------------------------------------------------------- From chella at unipa.it Fri Sep 10 13:33:30 1999 From: chella at unipa.it (Antonio Chella) Date: Fri, 10 Sep 1999 19:33:30 +0200 Subject: INTERNATIONAL SCHOOL ON NEURAL NETS <> Message-ID: <37D940E8.CA4E081A@unipa.it> [We apologize for multiple copies] INTERNATIONAL SCHOOL ON NEURAL NETS <> 4th Course: Subsymbolic Computation in Artificial Intelligence ERICE-SICILY: October 24-31, 1999 Motivations Autonomous intelligent agents that perform complex real world tasks must be able to build and process rich internal representations that allow them to effectively draw inferences, make decisions, and, in general, perform reasoning processes concerning their own tasks. Within the computational framework of artificial intelligence (AI) this problem has been faced in different ways. According to the classical, symbolic approach, internal representations are conceived in terms of linguistic structures, as expressions of a "language of thought". Other traditions have developed approaches that are less linguistically oriented, and more biologically and anatomically motivated. This is the case for neural networks, and for self-organizing and evolutionary algorithms. Empirical results concerning natural intelligent systems suggest that such approaches are not fully incompatible, and that different kinds of representation may interact. Similarly, it can be argued that the design of artificial intelligent systems can take advantage of different kinds of interacting representations that are suited for different tasks.
From this perspective, theoretical frameworks and methodological techniques are needed that allow different kinds of representation to be employed together in a principled way. In particular, autonomous agents need to find the meaning for the symbols they use within their internal processes and in the interaction with the external world, thus overcoming the well-known symbol grounding problem. An information processing architecture for autonomous intelligent agents should exhibit processes that act on suitable intermediate levels, which mediate between sensory data, the symbolic level, and actions. These processes could be defined in terms of subsymbolic computation paradigms, such as neural networks, self-organizing, and evolutionary algorithms. DIRECTOR OF THE COURSE: Salvatore Gaglio DIRECTOR OF THE SCHOOL: M. I. Jordan - M. Marinaro DIRECTOR OF THE CENTRE: A. Zichichi SCIENTIFIC SECRETARIAT: Edoardo Ardizzone, Antonio Chella, Marcello Frixione WEB PAGE OF THE SCHOOL: http://www.cere.pa.cnr.it/ScuolaErice/ ================== PROGRAM FOUNDATIONS Introduction to Artificial Intelligence L. CARLUCCI AIELLO, University of Rome "La Sapienza", Rome, Italy Neural modelling of higher order cognitive processes J. TAYLOR, King's College, London, UK Connectionist Models for Data Structures M. GORI, University of Siena, Siena, Italy Neural Systems Engineering I. ALEXANDER, Imperial College, London, UK ASEIT (Advanced School on Electronics and Information Technology) OPEN INTERNATIONAL WORKSHOP ON SUBSYMBOLIC TECHNIQUES AND ALGORITHMS P. GARDENFORS, Lund University, Sweden I. ALEXANDER, Imperial College, London, UK T. KOHONEN, Helsinki University of Technology, Finland J. TAYLOR, King's College, London, UK R. ARKIN, Georgia Institute of Technology, USA REPRESENTATION Conceptual Spaces P. GARDENFORS, Lund University, Sweden Topological Self Organizing Maps T. KOHONEN, Helsinki University of Technology, Finland Symbolic Representation L.
CARLUCCI AIELLO, University of Rome "La Sapienza", Rome, Italy VISUAL PERCEPTION Evolutionary Processes for Artificial Perception G. ADORNI, University of Parma, Italy S. CAGNONI, University of Parma, Italy Cognitive Architectures for Artificial Intelligence M. FRIXIONE, University of Salerno S. GAGLIO, University of Palermo Algorithms for computer vision V. DI GESU', University of Palermo, Italy ACTION Motion Maps P. MORASSO, University of Genoa, Italy The self-organisation of grounded languages on autonomous robots L. STEELS, Free University of Brussels, Belgium Reinforcement Learning in Autonomous Robots C. BALKENIUS, Lund University, Sweden Behaviour-Based Robotics R. ARKIN, Georgia Institute of Technology, USA ================== APPLICATIONS Interested candidates should send a letter to the Director of the Course: Professor Salvatore GAGLIO Dipartimento di Ingegneria Automatica e Informatica Universita' di Palermo Viale delle Scienze 90128 - PALERMO - ITALY E-mail: gaglio at unipa.it They should specify: 1.date and place of birth, together with present nationality; 2.affiliation; 3.address, e-mail address. Please enclose a letter of recommendation from the group leader or the Director of the Institute or from a senior scientist. PLEASE NOTE Participants must arrive in Erice on October 24, not later than 5:00 pm. IMPORTANT The total fee, which includes full board and lodging (arranged by the School), is EURO 1000 (about 1000 USD). Thanks to the generosity of the sponsoring institutions, partial or full support can be granted to some deserving students who need financial aid. Requests to this effect must be specified and justified in the letter of application. Closing date for application: September 20, 1999 A limited number of places is available. Admission to the Workshop will be decided in consultation with the Advisory Committee of the School composed of Professors S. Gaglio, M. Marinaro, and A. Zichichi. 
An area for some contributed poster presentations will be available. These will be selected on the basis of an abstract of two A4 pages to be sent to the Director of the Course before September 20, 1999. From fet at socrates.berkeley.edu Fri Sep 10 17:18:05 1999 From: fet at socrates.berkeley.edu (Frederic Edouard Theunissen) Date: Fri, 10 Sep 1999 14:18:05 -0700 Subject: Position available in quantitative Psychology Message-ID: <007c01befbd1$f8de2fc0$e0f32080@hinault.Psych.Berkeley.EDU> ******************************************************************** UNIVERSITY OF CALIFORNIA AT BERKELEY: The Department of Psychology invites applications at any level for two tenured/tenure-track positions beginning July 1, 2000. We are interested in two areas: (1) quantitative psychology (including, but not limited to, multivariate analysis, measurement, mathematical modeling, and computer modeling), and (2) social/personality psychology. Applications for the position must be postmarked by October 1, 1999, and are to include a curriculum vitae, a description of research interests, and selected reprints, sent to: Search Committee, Department of Psychology, 3210 Tolman Hall #1650, University of California, Berkeley, CA 94720-1650. Candidates should also arrange to have at least three letters of recommendation sent to the same address by the application date. Candidates are asked to specify the position for which they are applying, and to submit an application for each position should they wish to be considered for both. Applications postmarked after the deadline will not be considered. The University of California is an Equal Opportunity/Affirmative Action Employer. From cweber at cs.tu-berlin.de Mon Sep 13 11:40:32 1999 From: cweber at cs.tu-berlin.de (Cornelius Weber) Date: Mon, 13 Sep 1999 17:40:32 +0200 (MET DST) Subject: Paper available Message-ID: The following ICANN'99 conference paper is now available on-line.
Orientation Selective Cells Emerge in a Sparsely Coding Boltzmann Machine Abstract: In our contribution we investigate a sparsely coded Boltzmann machine as a model for the formation of orientation selective receptive fields in primary visual cortex. The model consists of two layers of neurons which are recurrently connected and which represent the lateral geniculate nucleus and primary visual cortex. Neurons have ternary activity values +1, -1, and 0, where the 0-state is degenerate, being assumed with higher prior probability. The probability for a (stochastic) activation vector on the net obeys the Boltzmann distribution and maximum-likelihood leads to the standard Boltzmann learning rule. We apply a mean-field version of this model to natural image processing and find that neurons develop localized and oriented receptive fields. http://www.cs.tu-berlin.de/~cweber/publications/99sparseBM.ps 6 pages, 180 KB. From ken at phy.ucsf.EDU Tue Sep 14 04:18:14 1999 From: ken at phy.ucsf.EDU (Ken Miller) Date: Tue, 14 Sep 1999 01:18:14 -0700 (PDT) Subject: CSH/Stony Brook Fellowships: Interdisciplinary Research in Brain Theory Message-ID: <14302.1222.260286.182929@coltrane.ucsf.edu> The Cold Spring Harbor Laboratory and The State University of New York at Stony Brook: SICN Fellowships for Interdisciplinary Research in Neuroscience The Swartz Initiative for Computational Neuroscience (SICN) announces a program to promote collaborative studies between researchers in computational neuroscience, mathematics, physics, engineering and computer sciences. The goal is to understand the algorithms and implementations that underlie brain functions. To this end, SICN intends to foster the growth of interdisciplinary research in brain theory. SUNY at Stony Brook has strong departments in neurobiology, mathematics, physical sciences and technology. CSHL has had a strong program in cellular neurobiology and has made a growing commitment to theoretical neuroscience.
Scientists will be hired at the post-doctoral level to work in association with faculty or, if deemed appropriate, independently. The salary will be highly competitive and those selected will be eligible for continuing support from SICN. Candidates should submit (to Jonathan Wallach at the address below) a one-page summary of their research interests and goals, a CV, and the names of three academic references. For full consideration, applications should be received by October 15th, 1999. For further information about these positions please contact: Jonathan Wallach, Director The Swartz Foundation 535 Life Sciences SUNY at Stony Brook Stony Brook, NY 11794-5230, email: wallach at swartzneuro.org, tel: 516 632 4179. Visit the Swartz Foundation web site at www.swartzneuro.org for complete information about this program. From ken at phy.ucsf.EDU Tue Sep 14 04:34:39 1999 From: ken at phy.ucsf.EDU (Ken Miller) Date: Tue, 14 Sep 1999 01:34:39 -0700 (PDT) Subject: Paper available: Subregion Correspondence Model of Binocular Simple Cells Message-ID: <14302.2207.469424.704224@coltrane.ucsf.edu> The following paper is now available at ftp://ftp.keck.ucsf.edu/pub/ken/dispar.ps.gz (compressed postscript) ftp://ftp.keck.ucsf.edu/pub/ken/dispar.ps (postscript) http://www.keck.ucsf.edu/~ken (click on 'Publications') This is a preprint of an article that appeared as Journal of Neuroscience 19:7212-7229 (1999): http://www.jneurosci.org/cgi/content/abstract/19/16/7212 ------------------------------ The Subregion Correspondence Model of Binocular Simple Cells Ed Erwin and Kenneth D. Miller Dept. of Physiology, UCSF ABSTRACT: We explore the hypothesis that binocular simple cells in cat areas 17 and 18 show subregion correspondence, defined as follows: within the region of overlap of the two eyes' receptive fields, their ON subregions lie in corresponding locations, as do their OFF subregions.
This hypothesis is motivated by a developmental model (Erwin and Miller, 1998) that suggested that simple cells could develop binocularly matched preferred orientations and spatial frequencies by developing subregion correspondence. Binocular organization of simple cell receptive fields is commonly characterized by two quantities: interocular position shift, the distance in visual space between the center positions of the two eyes' receptive fields; and interocular phase shift, the difference in the spatial phases of those receptive fields, each measured relative to its center position. The subregion correspondence hypothesis implies that interocular position and phase shifts are linearly related. We compare this hypothesis with the null hypothesis, assumed by most previous models of binocular organization, that the two types of shift are uncorrelated. We demonstrate that the subregion correspondence and null hypotheses are equally consistent with previous measurements of binocular response properties of individual simple cells in the cat and other species, and with measurements of the distribution of interocular phase shifts vs. preferred orientations or vs. interocular position shifts. However, the observed tendency of binocular simple cells in the cat to have ``tuned excitatory'' disparity tuning curves with preferred disparities tightly clustered around zero (Fischer and Kruger, 1979; Ferster, 1981; LeVay and Voigt, 1988) follows naturally from the subregion correspondence hypothesis, but is inconsistent with the null hypothesis. We describe tests that could more conclusively differentiate between the hypotheses. The most straightforward test requires simultaneous determination of the receptive fields of groups of 3 or more binocular simple cells. -------------------------------- Ken Kenneth D. Miller telephone: (415) 476-8217 Dept.
of Physiology fax: (415) 476-4929 UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 From protopap at panteion.gr Tue Sep 14 14:36:20 1999 From: protopap at panteion.gr (Athanassios Protopapas) Date: Tue, 14 Sep 1999 21:36:20 +0300 (EET DST) Subject: Paper announcement: Conn Speech Perception Message-ID: Dear colleagues, I would like to bring to your attention my recent review paper titled "Connectionist Modeling of Speech Perception," published in Psychological Bulletin 125(4):410-436. You may find it interesting and perhaps useful for a graduate-level course, as it attempts to bring together the connectionist and speech literatures while requiring no substantial prior understanding of either beyond a general psychology background. If you do not have access to a personal or library subscription to Psychological Bulletin you may contact me for a photocopy of the article. The abstract is: Connectionist models of perception and cognition, including the process of deducing meaningful messages from patterns of acoustic waves emitted by vocal tracts, are developed and refined as our understanding of brain function, psychological processes, and the properties of massively parallel architectures advances. In the present article, several important contributions from diverse points of view in the area of connectionist modeling of speech perception are presented and their relative merits discussed with respect to specific theoretical issues and empirical findings. TRACE, the Elman/Norris net, and Adaptive Resonance Theory constitute pivotal points exemplifying overall modeling success, progress in temporal representation, and plausible modeling of learning, respectively. Other modeling efforts are presented for the specific insights they offer, and the article concludes with a discussion of computational versus dynamic modeling of phonological processes. Your comments will also be greatly appreciated.
Thanassi Protopapas -- Athanassios Protopapas, PhD Department of Educational Technology Phone: +30 1 680 0959 Institute for Language and Speech Processing Fax: +30 1 685 4270 Epidavrou & Artemidos 6, Marousi e-mail: protopap at ilsp.gr GR-151 25 ATHENS, Greece From hiro at ladyday.kyoto-su.ac.jp Wed Sep 15 00:06:16 1999 From: hiro at ladyday.kyoto-su.ac.jp (hiro@ladyday.kyoto-su.ac.jp) Date: Wed, 15 Sep 1999 13:06:16 +0900 Subject: jpsth paper (updated) Message-ID: <199909150406.NAA08066@ladyday.kyoto-su.ac.jp> Dear Connectionists: Sorry for disturbing you again, but I received several mails reporting font problems with my preprint. I have updated the file so that everyone can read it. I apologize for the inconvenience. Hiroyuki Ito Faculty of Engineering, Kyoto Sangyo University, Kyoto hiro at ics.kyoto-su.ac.jp --- my previous message ---- The following paper has been accepted for publication in Neural Computation and is available from http://www.kyoto-su.ac.jp/~hiro/jpsth_rev3.pdf Model Dependence in Quantification of Spike Interdependence by Joint Peri-Stimulus Time Histogram Hiroyuki Ito and Satoshi Tsuji Department of Information and Communication Sciences, Faculty of Engineering, Kyoto Sangyo University, Kita-ku, Kyoto 603-8555, Japan and CREST, Japan Science and Technology From mclennan at cs.utk.edu Wed Sep 15 16:51:04 1999 From: mclennan at cs.utk.edu (Bruce MacLennan) Date: Wed, 15 Sep 1999 16:51:04 -0400 Subject: CFP/field computation Message-ID: <199909152051.QAA07609@maclennan.cs.utk.edu> Dear Colleagues: As program co-chair of the 4th International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE (Atlantic City, February 27 - March 3, 2000), I am organizing a special session on FIELD COMPUTATION (continuum-limit neural computation). (The general conference Call for Papers is attached.) A number of groups are now working in this area, and the time is ripe to gather in one place and compare results.
If you have been working in this area, I hope you will consider submitting a paper to this session. Although the conference CFP lists Sep 1 as the deadline for receipt of summaries, we will continue to receive them for this special session through Sep 30. However, if you are interested in participating but cannot meet this deadline, please let me know and I'll see what we can arrange. Best wishes and thank you, Bruce MacLennan Department of Computer Science The University of Tennessee Knoxville, TN 37996-1301 PHONE: (423)974-5067 FAX: (423)974-4404 EMAIL: maclennan at cs.utk.edu URL: http://www.cs.utk.edu/~mclennan [sic] ===================================================================== Call for Papers 4th International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE http://www.csci.csusb.edu/iccin Trump Taj Mahal Casino and Resort, Atlantic City, NJ USA February 27 -- March 3, 2000 Summary Submission Deadline: September 1, 1999 Conference Co-chairs: Subhash C. Kak, Louisiana State University Jeffrey P. Sutton, Harvard University This conference is part of the Fourth Joint Conference on Information Sciences. http://www.ee.duke.edu/JCIS/ ***Added plenary speakers***: Marvin Minsky and Brian Josephson Plenary Speakers include the following: +------------------------------------------------------------------------+ |James Anderson |Wolfgang Banzhaf |B. Chandrasekaran|Lawrence J. Fogel| |-----------------+------------------+-----------------+-----------------| |Walter J. Freeman|David E. Goldberg |Irwin Goodman |Stephen Grossberg| |-----------------+------------------+-----------------+-----------------| |Thomas S.Huang |Janusz Kacprzyk |A. C. Kak |Subhash C. Kak | |-----------------+------------------+-----------------+-----------------| |John Mordeson |Kumpati S. Narenda|Anil Nerode |Huang T. Nguyen | |-----------------+------------------+-----------------+-----------------| |Jeffrey P.
Sutton|Ron Yager | | | +------------------------------------------------------------------------+ Areas for which papers are sought include: o Artificial Life o Artificially Intelligent NNs o Associative Memory o Cognitive Science o Computational Intelligence o DNA Computing o Efficiency/Robustness Comparisons o Evolutionary Computation for Neural Networks o Feature Extraction & Pattern Recognition o Implementations (Electronic, Optical, Biochips) o Intelligent Control o Learning and Memory o Neural Network Architectures o Neurocognition o Neurodynamics o Optimization o Parallel Computer Applications o Theory of Evolutionary Computation Summary Submission Deadline: September 1, 1999 Notification of authors upon review: November 1, 1999 December 1, 1999 - Deadline for invited sessions and exhibition proposals Papers will be accepted based on summaries. A summary shall not exceed 4 pages of 10-point font, double-column, single-spaced text, with figures and tables included. For the Fourth ICCIN, send 3 copies of summaries to: George M. Georgiou Computer Science Department California State University San Bernardino, CA 92407-2397 U.S.A. georgiou at csci.csusb.edu From Paolo.Gaudiano at artificial-life.com Wed Sep 15 12:46:21 1999 From: Paolo.Gaudiano at artificial-life.com (Paolo Gaudiano) Date: Wed, 15 Sep 1999 12:46:21 -0400 Subject: FINAL CALL FOR PARTICIPATION in BOSTON, October 3-6 Message-ID: FINAL ANNOUNCEMENT: UI-CANCS'99 to take place in Boston on October 3-6. [You will likely receive multiple copies of this. That's because this is such a great conference at such a low price that we think everyone should know about it before it's too late :-). Sorry about the clutter.] What: USA-Italy Conference on Applied Neural and Cognitive Sciences When: October 3-6, 1999 Where: Boston University Sherman Union, 775 Commonwealth Ave, Boston.
Web: www.usa-italy.org Come hear outstanding members of industry and academia discuss state-of-the-art research and applications in intelligent agents, robotics, smart sensors, artificial intelligence, biomedical engineering and other cutting-edge technologies. Thanks to the generous support of Artificial Life, Inc. and of the Italian Ministry of Foreign Affairs, we are able to offer extremely low registration rates---even including meals and a tour of local area labs (space limited). Please visit our web site to see the exciting line-up we have planned and for additional details. If you have any questions please send e-mail to . From smagt at dlr.de Thu Sep 16 04:25:00 1999 From: smagt at dlr.de (Patrick van der Smagt) Date: Thu, 16 Sep 1999 10:25:00 +0200 (MET DST) Subject: 3 PhD job openings at DLR, Oberpfaffenhofen, Germany Message-ID: <199909160825.KAA03547@ilz.robotic> The Robotics neuro-group at the German Aerospace Center in Oberpfaffenhofen, Germany, has three Ph.D. position openings on the following subjects: * Learning changing environments in large neural networks * Multiresolution neural networks for fast and accurate learning * Active vision using statistical inference neural networks Visit http://www.robotic.dlr.de/LEARNING/jobs/ for more information. Condensed versions of these job descriptions follow: ---------------------------------------------------------------------- Learning changing environments in large neural networks Goal of this research project is the exploration and development of learning methodologies (optimization) for large-scale neural networks. Secondly, the effect of incremental learning in such networks is to be investigated. Close cooperation with the vision and DLR Four-Finger-Hand groups in our department are important in the development and application of the methodologies. We are looking for a candidate to join our currently expanding neuro-group in the DLR Robotics Department in Oberpfaffenhofen, Germany. 
The desired candidate has a strong background in mathematics as well as statistics, and knows her way around programming in C++ and Mathematica. ---------------------------------------------------------------------- Multiresolution neural networks for fast and accurate learning Goal of this Ph.D. research is the further development of a multiresolution approximation method in the application of on-line learning of high-dimensional data, using MARS as well as the Nested Network as a starting-point. An important issue is the applicability to high-dimensional problems. The theoretical and implementational aspects of this project are of equal importance; as a result, the algorithm should be implemented as a real-time shape-from-shading task, as dynamics control in complex hand-eye coordination tasks, and for learning data resulting from grasping tasks using the DLR Four-Finger-Hand, which is developed and available in the DLR Robotics group. For pursuing research in this area we are looking for an outstanding, qualified candidate to join our currently expanding neuro-group in the DLR Robotics Department in Oberpfaffenhofen, Germany. The research project has a major computer science component; therefore, the applicant ideally comes from a CS background, while having a strong foothold in statistics. Knowledge of function approximation with neural networks is a plus. Prerequisites are familiarity with a UNIX environment and knowledge of C++. ---------------------------------------------------------------------- Active Vision using statistical inference neural networks The goal of this Ph.D. research is to develop a methodology based on statistical inference neural networks, which can be used in active vision control.
The theoretical and implementational aspects are of equal importance; in the end, a system should result which can be used on the DLR Four-Finger-Hand in combination with the DLR Light-Weight robot, both of which are developed and available in the DLR Robotics Department. The resulting system should eventually cooperate with a grasping methodology which is currently in development. For pursuing research in this exciting field we are looking for an outstanding candidate to join our currently expanding neuro-group in the DLR Robotics Department in Oberpfaffenhofen, Germany. In-depth knowledge and practice in applied mathematics and strong programming skills are mandatory. Experience is expected in at least one of the following areas: * statistical modeling, statistical inference * data mining * artificial neural networks/machine learning * image processing ---------------------------------------------------------------------- Please direct all application material (CV/resume, xeroxed diplomas, letters of reference, and, where available, reprints of articles) or further questions to Dr. Patrick van der Smagt Institute of Robotics and Systems Dynamics DLR Oberpfaffenhofen D-82334 Wessling Phone +49 8153 281152 Fax +49 8153 281134 Email smagt at dlr.de Start of appointments: immediate Salary: according to half BAT IIa Each of the appointments is limited to a period of 3 years. Standard EU regulations apply. -- Dr Patrick van der Smagt phone +49 8153 281152, fax -34 DLR/Institute of Robotics and System Dynamics smagt at dlr.de P.O.Box 1116, 82230 Wessling, Germany http://www.robotic.de/Smagt/ From aapo at james.hut.fi Thu Sep 16 10:17:52 1999 From: aapo at james.hut.fi (Aapo Hyvarinen) Date: Thu, 16 Sep 1999 17:17:52 +0300 (EEST) Subject: papers on ICA Message-ID: <199909161417.RAA34929@james.hut.fi> Dear Connectionists, the following papers on extensions of ICA can now be found on my web page.
- Aapo Hyvarinen http://www.cis.hut.fi/~aapo/ --------------------------------------------------------------------- A. Hyvarinen and P. Hoyer. Topographic Independent Component Analysis. http://www.cis.hut.fi/~aapo/ps/gz/TICA.ps.gz Abstract: In ordinary independent component analysis, the components are assumed to be completely independent, and they do not necessarily have any meaningful order relationships. In practice, however, the estimated ``independent'' components are often not at all independent. We propose that this residual dependence structure could be used to define a topographic order for the components. In particular, a distance between two components could be defined using their higher-order correlations, and this distance could be used to create a topographic representation. Thus we obtain a linear decomposition into approximately independent components, where the dependence of two components is approximated by the proximity of the components in the topographic representation. --------------------------------------------------------------------- A. Hyvarinen and P. Hoyer. Emergence of phase and shift invariant features by decomposition of natural images into independent feature subspaces. (to appear in Neural Computation) http://www.cis.hut.fi/~aapo/ps/gz/NC99_complex.ps.gz Olshausen and Field (1996) applied the principle of independence maximization by sparse coding to extract features from natural images. This leads to the emergence of oriented linear filters that have simultaneous localization in space and in frequency, thus resembling Gabor functions and simple cell receptive fields. In this paper, we show that the same principle of independence maximization can explain the emergence of phase and shift invariant features, similar to those found in complex cells. This new kind of emergence is obtained by maximizing the independence between norms of projections on linear subspaces (instead of the independence of simple linear filter outputs). 
The norms of the projections on such `independent feature subspaces' then indicate the values of invariant features. ---------------------------------------------------------------------- (Some other new papers on ICA can be found on my publication page http://www.cis.hut.fi/~aapo/pub.html as well.) From zhaoping at gatsby.ucl.ac.uk Fri Sep 17 04:55:53 1999 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Fri, 17 Sep 1999 09:55:53 +0100 (BST) Subject: Paper available on a model of visual search Message-ID: Title: Contextual influences in V1 as a basis for pop out and asymmetry in visual search Author: Zhaoping Li Published in Proc Natl Acad Sci, USA, Volume 96, 1999, pages 10530-10535 Available at http://www.gatsby.ucl.ac.uk/~zhaoping/preattentivevision.html or at http://www.pnas.org/content/vol96/issue18/#PSYCHOLOGY-BS Abstract: I use a model to show how simple, bottom-up, neural mechanisms in primary visual cortex can qualitatively explain the preattentive component of complex psychophysical phenomena of visual search for a target among distracters. Depending on the image features, the speed of search ranges from fast, when a target pops out or is instantaneously detectable, to very slow, and it can be asymmetric with respect to switches between the target and distracter objects. It has been unclear which neural mechanisms or even cortical areas control the ease of search, and no physiological correlate has been found for search asymmetry. My model suggests that contextual influences in V1 play a significant role. From mdorigo at ulb.ac.be Fri Sep 17 03:21:31 1999 From: mdorigo at ulb.ac.be (Marco DORIGO) Date: Fri, 17 Sep 1999 09:21:31 +0200 Subject: New book on Swarm Intelligence In-Reply-To: <2712.937551357@skinner.boltz.cs.cmu.edu> Message-ID: Swarm Intelligence: From Natural to Artificial Systems. Bonabeau E., M. Dorigo & G. Theraulaz (1999). New York: Oxford University Press.
The book "Swarm Intelligence" provides a detailed look at models of social insect behavior and how to apply these models in the design of complex systems. The book shows how these models replace an emphasis on control, preprogramming, and centralization with designs featuring autonomy, emergence, and distributed functioning. These designs are proving flexible and robust, able to adapt quickly to changing environments and to continue functioning even when individual elements fail. In particular, these designs are a novel approach to the tremendous growth of complexity in software and information. The book draws on up-to-date research from biology, neuroscience, artificial intelligence, robotics, operations research, and computer graphics, and each chapter is organized around a particular biological example, which is then used to develop an algorithm, a multiagent system, or a group of robots. Contents: 1. Introduction 2. Ant Foraging Behavior, Combinatorial Optimization, and Routing in Communication Networks 3. Division of Labor and Task Allocation 4. Cemetery Organization, Brood Sorting, Data Analysis, and Graph Partitioning 5. Self-Organization and Templates: Application to Data Analysis and Graph Partitioning 6. Nest Building and Self-Assembling 7. Cooperative Transport by Insects and Robots 8. Epilogue Information on how to order the book is available at: http://www.oup-usa.org/ From allan at nagoya.riken.go.jp Fri Sep 17 11:55:17 1999 From: allan at nagoya.riken.go.jp (Allan Kardec Barros) Date: Sat, 18 Sep 1999 00:55:17 +0900 (JST) Subject: Matlab Package on ICA Message-ID: <199909171555.AAA22401@mail.bmc.riken.go.jp> A non-text attachment was scrubbed...
Name: not available Type: text Size: 687 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/17cde9cd/attachment-0001.ksh From rsun at cecs.missouri.edu Fri Sep 17 10:16:37 1999 From: rsun at cecs.missouri.edu (Ron Sun) Date: Fri, 17 Sep 1999 09:16:37 -0500 Subject: IJCNN'2000 Call for Papers and Participation Message-ID: <199909171416.JAA23416@pc113.cecs.missouri.edu> ======================================================================== Call For Papers *** IJCNN-2000 *** IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS to be held in Grand Hotel di Como, Como, Italy -- July 24-27, 2000 sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society, and with the technical cooperation of the Japanese Neural Network Society, AEI (the Italian Association of Electrical and Electronic Engineers), SIREN (the Italian Association of Neural Networks), and AI*IA (the Italian Association for Artificial Intelligence) Submission deadline is 15 DECEMBER 1999. Full papers in the final form must be submitted (accepted papers will be published as submitted). Papers will be reviewed by senior researchers in the field. Acceptance/rejection will be emailed by 30 March 2000. Accepted papers will be published only if the registration form and payment for at least one of the authors are received by 30 April 2000 (see the complete call for papers for details). For the complete Call for Papers and other information (including information about Como, Italy), visit the conference web site at: http://www.ims.unico.it/2000ijcnn.html (The organizers may be contacted by email at ijcnn2000 at elet.polimi.it.) Important NOTICE: In the year 2000, the International Conference on Artificial Neural Networks (IJCANN), organized annually by ENNS, will not take place because it has been incorporated into IJCNN'2000. 
====================================================================== Publicity Chair for IJCNN'2000: Prof. Ron Sun http://www.cecs.missouri.edu/~rsun CECS Department phone: (573) 884-7662 University of Missouri-Columbia fax: (573) 882 8318 201 Engineering Building West Columbia, MO 65211-2060 email: rsun at cecs.missouri.edu From harnad at coglit.ecs.soton.ac.uk Sat Sep 18 15:05:50 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sat, 18 Sep 1999 20:05:50 +0100 (BST) Subject: PSYC Call for Book Reviewers: Neuropsychology of Lashley & Hebb Message-ID: PSYCOLOQUY CALL FOR BOOK REVIEWERS Below is the Precis of "The Neuropsychological Theories of Lashley and Hebb" by Jack Orbach (427 lines). This book has been selected for multiple review in PSYCOLOQUY. If you wish to submit a formal book review please write to psyc at pucc.princeton.edu indicating what expertise you would bring to bear on reviewing the book if you were selected to review it. (If you have never reviewed for PSYCOLOQUY or Behavioral & Brain Sciences before, it would be helpful if you could also append a copy of your CV to your inquiry.) If you are selected as one of the reviewers and do not have a copy of the book, you will be sent a copy of the book directly by the publisher (please let us know if you have a copy already). Reviews may also be submitted without invitation, but all reviews will be refereed. The author will reply to all accepted reviews. Full Psycoloquy book review instructions at: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psycoloquy/ Relevant excerpts: Psycoloquy reviews are of the book not the Precis. Length should be about 200 lines [c. 1800 words], with a short abstract (about 50 words), an indexable title, and reviewer's full name and institutional address, email and Home Page URL. All references that are electronically accessible should also have URLs. 
AUTHOR'S RATIONALE FOR SOLICITING COMMENTARY My rationale for seeking open peer commentary is primarily that the book says some things about both Lashley and Hebb that some peers might find controversial and startling if not downright outrageous. To get these views out in the open may be of pedagogical value not only to me but to the neuropsychological community at large. Obviously, I don't believe that my arguments are wrong or weak. But the feedback I get might conceivably persuade me to rethink the matter. psycoloquy.99.10.029.lashley-hebb.1.orbach Sat Sep 18 1999 ISSN 1055-0143 (16 paragraphs, 16 references, 427 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 1999 Jack Orbach Precis of: THE NEUROPSYCHOLOGICAL THEORIES OF LASHLEY AND HEBB Precis of Orbach on Lashley-Hebb [University Press of America, 1998 xiv, 395 pp. ISBN: 0-761-81165-6] Jack Orbach Department of Psychology Queens College Flushing, NY U.S.A. jorbach at worldnet.att.net ABSTRACT: Beginning in the 1920s, K. S. Lashley startled psychologists with his theories of the memory trace within the cerebral cortex. Using terms such as mass action, equipotentiality, and sensory/motor equivalence, Lashley presented evidence that the engram is widely distributed in the brain, and that unactivated synapses, like activated ones, seem to show evidence of learning. His research and nativistic theories made him world famous by 1929, when he was just 39. He spent his professional career searching for a mechanism for the reduplication of the engram. While his contemporaries tried to specify the locus of the engram in the brain, Lashley found it everywhere. He liked to quip that the problem is not to find where the trace is located, but where it is not. Lashley's student, D. O. Hebb, published his empiricistic theories in 1949, in "The Organization of Behavior," and the monograph created a sensation.
Hebb used Lorente de Nó's reverberatory circuit to provide a mechanism to maintain activity in the cerebral cortex after the stimulus terminated, the so-called central autonomous process. This led him to the cell assembly, a complex reverberatory circuit that could be assembled by experience. Changes in resistance at the synapse with learning came to be called the Hebb synapse. That monograph was highly praised for the breadth of its treatment. The present book documents how Lashley anticipated Hebb's introduction of the reverberatory circuit by some 12 years. Lashley's Vanuxem Lectures of 1952 are printed for the first time, together with nine of his previously published theoretical papers. Lashley's and Hebb's theories are reviewed and reevaluated fifty years after publication of Hebb's monograph, and a systematic effort is made to compare and contrast the views of teacher and student. KEYWORDS: cell assembly, central autonomous process, engram, equipotentiality, Hebb, Hebbian learning, Lashley, localization, memory trace, nativism, reverberatory circuit, Vanuxem Lectures 1. Part 1 of the book opens with a summary of Lashley's last public lecture given at the University of Rochester in 1957, one year before his death and eight years after the publication of Hebb's monograph. In this lecture, Lashley was still consumed with the notion of irradiating waves of excitation in the cerebral cortex, a notion he developed in detail in 1942. In citing theories of stimulus equivalence, Lashley wrote 'That of Hebb is most in accord with conditioned reflex theory. He assumes that multiple paths are developed by learning. Such learning is ruled out by a mass of evidence for innate discriminations and equivalencies.' In this unpublished address, Lashley cited Hebb's empiricistic theory for the first and only time. He never cited the monograph itself in the literature. 2.
An early chapter entitled 'Setting the Stage' offers another look at Lashley's early critique of the native Watsonian connectionism of his day. Lashley's early efforts to revise and revitalize neuropsychological theory are reviewed. The problem, Lashley suggested in the 1920s, was the omission of the brain from the Watsonian S-R formula. And when a model of cortical function was finally introduced, using the analogy of the telephone switchboard, it was based on the idea of linear reflex activity in the spinal cord, as suggested by Dewey, leaving no room for psychological categories that require sustained activity in the brain such as thought, memory, emotion, motivation, selective attention and the like. And then, along came Pavlov who undercut all contemporary speculations of psychologists with his physiological theories of conditioned reflexes and brain function. It was at this point that Lashley burst upon the scene. 3. Hebb must have experienced an epiphany when he was introduced to the reverberatory circuit of Lorente de Nó. He realized that this anatomical curiosity provided him with a mechanism for the autonomous central process that he developed so masterfully in the 1949 monograph. Hebb's revelation involving the reverberatory circuit was especially important for it gave neurological meaning to the earlier proposals of central motive state of Morgan and central excitatory mechanism of Beach as well as the putative reduplicated memory trace of Lashley. However, Lashley had already appropriated the reverberatory circuit for neuropsychological theory in 1937, some 12 years before Hebb's monograph was published and some three years before its presentation by Hilgard and Marquis in their Conditioning and Learning of 1940. This is documented with excerpts from Lashley's papers published in 1937, 1938, 1941, 1942 and 1949. The latter two papers are republished in their entirety in this volume. 4.
The next chapter deals with the learning theory that synaptic resistance is reduced by the passage of the nerve impulse. Lashley's 1924 assault on this theory is reviewed in detail. (This 1924 paper is also reprinted in this volume.) In Lashley's own words, 'Among the many unsubstantiated beliefs concerning the physiology of the learning process, none is more widely prevalent than the doctrine that the passage of the nerve impulse through the synapse somehow reduces synaptic resistance and leads to the fixation of a new habit . . . but no direct evidence for synaptic resistance has ever been obtained. The hypothesis is not based upon neurological data but is merely a restatement of the observed fact that increased efficiency follows repeated performance . . . Familiar learning curves are obviously an expression of these successive integrations and we have no knowledge of the conditions prevailing in formation of a new simple neural integration. (On the other hand,) the instantaneous character of simpler associations in man . . . suggests that . . . a single performance serves to fix the habit. Even if this were the case for every simple reintegration within the nervous system, we should still get the appearance of gradual improvement through practice because of the formation of many simple associations . . . The fact of gradual improvement in complex functions cannot therefore be taken as evidence for a gradual wearing down of synaptic resistance by repeated passage of the nerve impulse' (Lashley, 1924). The reemergence of this theory in Hebb's monograph as a neuropsychological postulate is documented and evaluated. In the fourth edition of Hebb's Textbook (1994), Donderi refers to the postulate as Hebb's rule. Today, it is frequently referred to as the Hebb synapse. 5. Next, 'Lashley's Memory Mechanisms', considers: i. 
Lashley's view that the memory trace is reduplicated in the cerebral cortex and the implications of that view on the interpretation of cerebral localization. In 1952, Lashley wrote 'I have never been able by any operation on the brain to destroy a specific memory' even when the memory is elicited by electrical stimulation of the part of the cerebral cortex that is subsequently removed. ii. Lashley's early introduction of the reverberatory circuit in neuropsychological theory is documented. It is important to note that Lashley never abandoned the principle of synaptic transmission in favor of a cortical field theory, as had been alleged by Hebb and others. This claim is fully documented. iii. Lashley assumed throughout his career that memory is a unitary function. He was of course aware of the distinction between long and short term memory but he never referred to the modern distinction between storage and retrieval. Nor did he ever consider associative and working memory as distinct forms of memory when he searched for the engram in the cerebral cortex. iv. Lashley's position on the continuity-discontinuity debate is reviewed as well as his championing the concept of instinct at a time when the concept was falling into disfavor in America. In 1950, Lashley championed the European ethologists' views of fixed action patterns though he himself preferred the term instinct. His article on instinct in the Encyclopaedia Britannica of 1956 is especially noteworthy. 6. The next chapter, entitled 'Issues of Priority, Opinion and Criticism', includes: i. a reconsideration of Lashley's obsession with theoretical criticism in his later years, as alleged by Hebb. ii. an interpretation of the meaning of Lashley's refusal of Hebb's offer to coauthor the 1949 volume with him. iii. the history of the reverberatory circuit in the psychological literature, and questions of priority as far as the integration of the reverberatory circuit into neuropsychological theory is concerned. iv. 
the fact that Lashley failed to acknowledge data and theory that were unfavorable to his views, during his search for the engram. This is documented. v. Lashley's opinion of Hebb's theories was never known because Lashley hardly ever spoke of them. But Lashley's 1947 review of Hebb's manuscript-in-progress is revealing in this regard. Revealing as well is Lashley's letter of congratulations to Hebb after publication of his 1949 monograph. Finally, the personal relationship of Lashley, the teacher, and Hebb, the student, is delineated. 7. The next chapter, titled 'Hebb's The Organization of Behavior 50 Years After Publication', offers a contemporary view of Hebb's enduring contributions to neuropsychological theory. Hebb bolstered his theories with the following facts: i. adults seem to be able to sustain brain injury with fewer permanent consequences than can infants and children; ii. learning in children is much more difficult compared to similar learning in adults. Hebb further argued that: iii. distinctions should be made between primitive unity, non-sensory figure and identity in perception; iv. stimulus equivalence and generalization are learned in early life; v. the ratio of association cortex to sensory cortex should be considered in phylogenetic development; vi. the evidence of post-tetanic potentiation supports the importance of the Hebb synapse in learning (this phenomenon was described after the 1949 monograph was published but it found its way into Hebb's later writings); vii. there is a distinction between intelligence A (innate potential) and intelligence B (achievement); viii. following Tolman, Hebb introduced a new way of thinking about neuropsychological problems in his 1960 presidential address to APA. That discourse is today named cognitive psychology; ix. later research on the stabilized retinal image supported cell assembly theory. 8. 
'The Left and Right Cerebral Hemispheres' reviews the case of Alex, a nine year old boy whose left hemisphere was removed for the relief of intractable epileptic seizures. Though he never learned to speak before surgery, Alex began to show remarkable gains in speech and language and in cognitive skills in general. Alex's postoperative achievements challenge the widely held view, shared by Hebb, that early childhood is a particularly critical period for the acquisition of cognitive skills. It must be concluded that clearly articulated, well-structured, and appropriate language can be acquired for the first time as late as nine years of age with the right hemisphere alone. Hebb and Lashley did not live to see this case. My guess is that Hebb would have had great difficulty in explaining Alex's achievements, but Lashley would have chuckled and muttered in so many words, 'You see, not only do you have reduplication of the memory trace within a hemisphere but also between hemispheres.' 9. The next chapter is entitled, 'A Comparison of Lashley and Hebb on the Concepts of Attention and Stimulus Generalization'. On the concept of attention, Lashley took off from his observations of attempted solutions in rats during the learning of the maze. It was Spence's concession on this matter that persuaded Lashley that he had bested the neo-behaviorists on the continuity-discontinuity debate. In the 1942 paper (reprinted in this volume), Lashley argued that a pattern of excitation in the cortex in the form of a reverberatory circuit may form a relatively stable and permanent foundation, modifying the effects of later stimulation, as attention determines the selection of stimuli. These ideas precede Hebb's formulation of the central autonomous process by some seven years. And then in the Vanuxem Lectures of 1952, Lashley went way beyond Hebb when he introduced the ideas of a priming or pre-setting of circuits based upon the spacing of the end-feet on the post-synaptic cell.
Hebb was by far the more accomplished writer and so, with the publication of his monograph in 1949, he captured the attention of the neuropsychological community with ideas that did not differ substantially from Lashley's. 10. However, on the matter of stimulus generalization their positions were radically different. Lashley's position is nativistic: stimulus generalization, if it exists at all, is built into the organism. His conception derived from his critique of the neo-Pavlovian view of a gradient in stimulus similarity underlying stimulus generalization. His discontinuity position on learning led him to write in 1946 'Stimulus generalization is generalization only in the sense of failure to note distinguishing characteristics of the stimulus or to associate them with the conditioned reaction. A definite attribute of the stimulus is abstracted and forms the basis of reaction; other attributes are either not sensed or are disregarded. So long as the effective attribute is present, the reaction is elicited as an all-or-none function of that attribute. Other characteristics of the stimulus may be radically changed without affecting the reaction' (Lashley and Wade, 1946). The neo-Pavlovian gradient of similarity on a stimulus continuum is an artifact of inattention. Such a stimulus generalization is generalization by default. 11. Hebb's concept of stimulus generalization was developed in connection with his delineation of the formation of the cell assembly underlying the perception of a triangle. Hebb's contribution was to perceptual theory. He proposed the startling idea that a simple figure like an outline triangle is not perceived as a whole, innately, as alleged by the gestaltists. He went on to show how the elements of line and angle become integrated into a unified perception of a triangle. To persuade the skeptical reader, Hebb introduced the idea of perceptual identity, something that has to be learned.
He then proposed a mechanism involving neural fractionation and recruitment. Fractionation eliminates the variable cells that are excited extramacularly. Macular excitation, which is due to ocular fixation, remains constant despite the variable angular size of the stimulus object. 12. In short, stimulus generalization emerges secondarily from the slow development of each complex concept. And yet, there is some doubt regarding the universality of stimulus generalization according to Hebb. His theory cannot always predict stimulus generalization from the learning of a simple discrimination. Take for example the learning to discriminate a vertical from a horizontal line. In this case, the stimuli belong to the category of primitive unity for which, unlike the triangle, no learning is required, according to Hebb, to build a unified percept. Nevertheless, our best guess is that, empirically, after the initial learning to discriminate the two lines, the organism would show stimulus generalization to a vertical rectangle vs. a horizontal rectangle and even to a vertical row of circles vs. a horizontal row of circles. Since there is no initial learning to build a unified perception of the vertical and horizontal lines, it is difficult to see how Hebb would derive the empirical data of stimulus generalization in this case. 13. A late chapter of commentary is entitled, 'Lashley's Enduring Legacy to Neuropsychological Theory'. A contemporary perspective is offered in reviewing the concepts of vicarious functioning, equipotentiality, reduplicated memory trace, the reverberatory circuit. Lashley's lesson that synapses inactive during learning can show the effects of learning is emphasized. Lashley's lesson was never acknowledged by Hebb or any of his students. Lashley derided the use of wiring diagrams in neuropsychological theory especially those derived from computer technology. Neurons are live metabolizing cells, he argued, not inert paths like copper wires. 
They interact at synapses, which are not solder joints like soldered copper wires. The synaptic contacts are variable. Furthermore, synaptic contacts may be excitatory and/or inhibitory. Soldered wires are always excitatory and fixed. Both are pathways to be sure but the differences between neurons and copper wires far outnumber their similarities. Thus brain organization cannot be modelled by circuit diagrams representing inert pathways. 14. An epilogue presents a number of personal vignettes of both Lashley and Hebb. Lashley's career was reviewed earlier in some detail in Orbach (1982). The most disturbing part of this story has to do with Lashley's racism, as alleged by Weidman (1996). I would not have raised this matter in a scholarly volume concerned with Lashley's contributions to neuropsychological theory were it not for Weidman's allegation that Lashley's racist attitudes influenced his theoretical views. But, did these odious attitudes of Lashley affect his science? I can attest to Lashley's anti-African-American attitudes, but I can find no evidence that Lashley's racism colored his theories. A lifelong student of genetics, Lashley had an abiding interest in the concept of instinct and in the genetics of behavior in general. These facts must have eluded Weidman. Both Hebb and Lashley were honored many times during their lifetimes. It is especially noteworthy that Hebb was appointed Chancellor of McGill University, and that he was nominated, in 1965, for the Nobel Prize. 15. During his freshman year, at the age of 16, Lashley studied general zoology and comparative anatomy with Albert M. Reese at the University of West Virginia. Reese appointed him departmental assistant, at a salary of $0.25 per hour. One of the new assistant's first tasks was to sort out various materials in the basement. The result of this assignment can best be expressed in Lashley's own words: 'Among them I found a beautiful Golgi series of the frog brain. 
I took these to Reese and proposed that I draw all of the connections between the cells. Then we would know how the frog worked (sic!). It was a shock to learn that the Golgi method does not stain all cells, but I think almost ever since I have been trying to trace those connections' (Beach in Orbach, 1982). Only later did Lashley realize that functional variables such as spatial and temporal summation, excitatory and inhibitory states, and micro-movements of elements influencing synaptic contact need not be represented microscopically. The lesson is that neurons are not inert and static, like soldered wires. They are live metabolizing cells with synaptic contacts that vary. If Lashley were alive today, there is no doubt that he would continue to scold modern neuroscientists who still have not become aware of the importance of this fact. 16. Part 2 of the book consists of nine of Lashley's major theoretical papers reprinted in their entirety. These are listed in the References below. Part 2 also includes Lashley's four Vanuxem Lectures given at Princeton University in 1952, and published here for the first time. In these lectures, Lashley referred to the anatomical observations of Lorente de Nó and emphasized the neural net as the active neural unit in the cerebral cortex. He introduced the idea of a neural priming or presetting, concepts all highly reminiscent of Hebb's theorizing on the central autonomous process and the cell assembly. The term neural lattice was coined by Lashley in 1949. This term was discarded by Hebb in his 1949 monograph in favor of cell assembly. REFERENCES: http://www.wabash.edu/depart/psych/Courses/Psych_81/LASHLEY.HTM http://www.archives.mcgill.ca/guide/volume2/gen01.htm#HEBB, DONALD OLDING http://www.princeton.edu/~harnad/hebb.html http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.amit.html Hebb, D. O. (1949) The Organization of Behavior: a Neuropsychological Theory. New York: Wiley. Hebb, D. O. and Donderi, D.C.
(1994) Textbook of Psychology, fourth edition, revised. Dubuque, Iowa: Kendall/Hunt Publishing Company. Hilgard, E. R. and Marquis, D. G. (1940) Conditioning and Learning, NY: Appleton-Century. Lashley, K. S. (1924) 'Studies of cerebral function in learning. VI. The theory that synaptic resistance is reduced by the passage of the nerve impulse.' Psychol. Rev., 31, 369-375. Lashley, K. S. (1931) 'Mass action in cerebral function.' Science, 73, 245-254. Lashley, K. S. (1942) 'The problem of cerebral organization in vision.' Biol. Symp., 7, 301-322. Lashley, K. S. (1949) 'Persistent problems in the evolution of mind.' Quart. Rev. Biol., 24, 28-42. Lashley, K. S. (1950) 'In search of the engram.' In Symp. Soc. Exp. Biol. No. 4, Cambridge, Eng.: Cambridge Univ. Press. Lashley, K. S. (1951) 'The problem of serial order in behavior.' In Jeffress, L. A. (Ed.) Cerebral Mechanisms in Behavior, New York: Wiley. Lashley, K. S. (1952) Vanuxem Lectures delivered at Princeton University in Feb. 1952. Untitled. Lashley, K. S. (1954) 'Dynamic processes in perception.' In Adrian, E. D., Bremer, F. and Jasper, H. H. (Eds.) Brain Mechanisms and Consciousness. Illinois: Charles C. Thomas, 422-443. Lashley, K. S. (1968) 'Cerebral organization and behavior.' In The Brain and Human Behavior, Proc. Ass. Res. Nerv. Ment. Dis., 36, 1-18. Lashley, K. S. and Wade, M. (1946) 'The Pavlovian theory of generalization.' Psychol. Rev., 53, 72-87. Lashley, K. S., Chow, K.-L., and Semmes, J. (1951) 'An examination of the electrical field theory of cerebral integration.' Psychol. Rev., 58, 123-136. Orbach, J. (1982) Neuropsychology After Lashley: Fifty Years Since the Publication of Brain Mechanisms and Intelligence. Hillsdale, NJ: Lawrence Erlbaum Associates. Weidman, N. (1996) 'Psychobiology, progressivism, and the anti-progressive tradition.' J. Hist. Biol., 29, 267-308.
From jon at syseng.anu.edu.au Mon Sep 20 04:12:32 1999 From: jon at syseng.anu.edu.au (Jonathan Baxter) Date: Mon, 20 Sep 1999 18:12:32 +1000 Subject: New paper on Direct Reinforcement Learning Message-ID: <37E5EC70.90621447@syseng.anu.edu.au> The following paper is available from http://wwwsyseng.anu.edu.au/~jon/papers/drlexp.ps.gz. It is a sequel to http://wwwsyseng.anu.edu.au/~jon/papers/drlalg.ps.gz All comments welcome. Title: Direct Gradient-Based Reinforcement Learning: II. Gradient Ascent Algorithms and Experiments Authors: Jonathan Baxter, Lex Weaver and Peter Bartlett Australian National University Abstract: In \cite{drl1} we introduced \pomdpg, an algorithm for computing arbitrarily accurate approximations to the performance gradient of parameterized partially observable Markov decision processes (\pomdps). The algorithm's chief advantages are that it requires only a single sample path of the underlying Markov chain, it uses only one free parameter $\beta\in [0,1)$ which has a natural interpretation in terms of bias-variance trade-off, and it requires no knowledge of the underlying state. In addition, the algorithm can be applied to infinite state, control and observation spaces. In this paper we present \conjgrad, a conjugate-gradient ascent algorithm that uses \pomdpg\ as a subroutine to estimate the gradient direction. \conjgrad\ uses a novel line-search routine that relies solely on gradient estimates and hence is robust to noise in the performance estimates. \olpomdp, an on-line gradient ascent algorithm based on \pomdpg\ is also presented. The chief theoretical advantage of this gradient based approach over value-function-based approaches to reinforcement learning is that it guarantees improvement in the performance of the policy at {\em every} step. 
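For readers new to this line of work, the flavor of such a single-sample-path gradient estimator can be sketched in a few lines. This is an illustrative toy only, not the authors' \pomdpg: the three-state MDP, the tabular softmax policy, and every name below are invented for the example, whereas the actual algorithm handles parameterized POMDPs without knowledge of the underlying state.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented toy problem: 3 states, 2 actions, random transitions and rewards.
n_states, n_actions = 3, 2
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # P[s, a] = next-state dist.
R = rng.standard_normal(n_states)                                 # reward received in each state
theta = np.zeros((n_states, n_actions))                           # tabular softmax policy params

def policy(s, theta):
    p = np.exp(theta[s] - theta[s].max())
    return p / p.sum()

def gradient_estimate(theta, beta=0.9, T=20_000):
    """Estimate the performance gradient from one sample path.

    beta in [0, 1) discounts the eligibility trace and mediates the
    trade-off between bias (low beta) and variance (high beta).
    """
    z = np.zeros_like(theta)       # eligibility trace of log-policy gradients
    delta = np.zeros_like(theta)   # running average of r_t * z_t
    s = 0
    for t in range(1, T + 1):
        p = policy(s, theta)
        a = rng.choice(n_actions, p=p)
        grad_log = -p
        grad_log[a] += 1.0         # gradient of log pi(a|s) w.r.t. theta[s]
        z *= beta
        z[s] += grad_log
        s = rng.choice(n_states, p=P[s, a])
        delta += (R[s] * z - delta) / t
    return delta                   # same shape as theta

g = gradient_estimate(theta)
```

An estimate like g could then drive an ascent step of an outer optimizer, which is the role the paper assigns to its conjugate-gradient routine.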
To show that this advantage is real, we give experimental results in which \conjgrad\ was used to optimize a simple three-state Markov chain controlled by a linear function, a two-dimensional ``puck'' controlled by a neural network, a call admission queueing problem, and a variation of the classical ``mountain-car'' task. In all cases the algorithm rapidly found optimal or near-optimal solutions. From berthold at ICSI.Berkeley.EDU Mon Sep 20 18:23:04 1999 From: berthold at ICSI.Berkeley.EDU (Michael Berthold) Date: Mon, 20 Sep 1999 15:23:04 -0700 (PDT) Subject: new book: Intelligent Data Analysis, An Introduction Message-ID: <199909202223.PAA13641@fondue.ICSI.Berkeley.EDU> The following textbook might be of interest to readers of the Connectionist mailing list: "Intelligent Data Analysis: An Introduction" edited by Michael Berthold and David J. Hand (Springer-Verlag, 1999. ISBN 3-540-65808-4) The idea for this book arose when, through teaching classes on IDA and doing consulting work, we realized that there was no coherent textbook to which we could direct students or interested researchers and practitioners in the field. We considered writing such a book ourselves, but abandoned this idea when we realized how wide would be the range of topics which should be covered. Instead, we decided to invite appropriate experts to contribute separate chapters on various fields, and we took pains to ensure that these chapters complemented and built on each other, so that a rounded picture resulted. Our aim was that, rather than focusing on state-of-the-art research, where it is always difficult to tell which ideas will turn out to be really important, each chapter should provide a thorough introduction to its domain.
The areas covered are:
- Statistical Concepts (Chapter 2)
- Statistical Methods (Chapter 3)
- Bayesian Methods (Chapter 4)
- Analysis of Time Series (Chapter 5)
- Rule Induction (Chapter 6)
- Neural Networks (Chapter 7)
- Fuzzy Logic (Chapter 8)
- Stochastic Search Methods (Chapter 9)
The book begins with an introduction to the field of intelligent data analysis (Chapter 1) and concludes with a discussion of applications (Chapter 10) and a list of available tools (Appendix A). The table of contents and the preface can be accessed from the Springer (Germany) web-site at: http://www.springer.de/cgi-bin/search_book.pl?isbn=3-540-65808-4 We hope that the community will find this book useful. From magnus at cs.man.ac.uk Tue Sep 21 10:09:57 1999 From: magnus at cs.man.ac.uk (Magnus Rattray) Date: Tue, 21 Sep 1999 15:09:57 +0100 Subject: PhD studentship: Riemannian geometry of neural networks and statistical models Message-ID: <37E791B5.CF5D80A3@cs.man.ac.uk> ------------------------------------------- PhD studentship: Riemannian geometry of neural networks and statistical models ------------------------------------------- Applications are sought for a three year PhD position to study various applications of Riemannian geometry in neural networks and statistical models. The position will be supported by an EPSRC studentship and based in the computer science department at Manchester University, which is one of the largest and most successful computer science departments in the UK. Living expenses will be paid according to current EPSRC rates (19635 pounds over three years) with substantial extra funding available for participation at international conferences and workshops. For more details contact: Magnus Rattray (magnus at cs.man.ac.uk) Computer Science Department, University of Manchester, Manchester M13 9PL, UK. Tel +44 161 275 6187.
http://www.cs.man.ac.uk/~magnus/magnus.html Start date: Immediate From mieko at hip.atr.co.jp Wed Sep 22 00:30:09 1999 From: mieko at hip.atr.co.jp (Mieko Namba) Date: Wed, 22 Sep 1999 13:30:09 +0900 Subject: Neural Networks 12(7&8) Message-ID: <199909220437.NAA08352@mailhost.hip.atr.co.jp> NEURAL NETWORKS SPECIAL ISSUE 1999 12(7&8) Contents - Volume 12, Number 7&8 - 1999
------------------------------------------------------------
ARTICLES:

Towards the networks of the brain: from brain imaging to consciousness
J.G. Taylor

What are the computations of the cerebellum, the basal ganglia, and the cerebral cortex?
K. Doya

Sequence generation in arbitrary temporal patterns from theta-nested gamma oscillations: a model of the basal ganglia-thalamo-cortical loops
T. Fukai

A model of computation in neocortical architecture
E. Korner, M.O. Gewaltig, U. Korner, A. Richter, and T. Rodemann

Architecture and dynamics of the primate prefrontal cortical circuit for spatial working memory
S. Tanaka

Computation of pattern invariance in brain-like structures
S. Ullman and S. Soloviev

Unsupervised visual learning of 3D objects using a modular network architecture
H. Ando, S. Suzuki, and T. Fujita

Organization of face and object recognition in modular neural network models
M.N. Dailey and G.W. Cottrell

On redundancy in neural architecture: dynamics of a simple module-based neural network and initial state independence
K. Tsutsumi

Complex behavior by means of dynamical systems for an anthropomorphic robot
T. Bergener, C. Bruckhoff, P. Dahm, H. Janben, F. Joublin, R. Menzner, A. Steinhage, and W. Von Seelen

Generative character of perception: a neural architecture for sensorimotor anticipation
H.M. Gross, A. Heinze, T. Seiler, and V. Stephan

Learning to perceive the world as articulated: an approach for hierarchical learning in sensory-motor systems
J. Tani and S. Nolfi

Adaptive internal state space construction method for reinforcement learning of a real-world agent
K. Samejima and T. Omori

Emergence of symbolic behavior from brain-like memory with dynamic attention
T. Omori, A. Mochizuki, K. Mizutani, and M. Nishizaki

Internal models in the control of posture
P. Morasso, L. Baratto, R. Capra, and G. Spada

Temporally correlated inputs to leaky integrate-and-fire models can reproduce spiking statistics of cortical neurons
Y. Sakai, S. Funahashi, and S. Shinomoto

The consolidation of learning during sleep: comparing the pseudorehearsal and unlearning accounts
A. Robins and S. McCallum

\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\
SPECIAL DISCOUNT for the SPECIAL ISSUE!
\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\
We have set a special discount for the special issue: the ordinary price is US$ 60, but we are offering it for US$ 30 (a 50% discount) if you order before November 15, 1999. The contact addresses for ordering are the same as given below.
\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\
Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ______________________________ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems.
Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership with Neural Networks:
  INNS: $80 ($55 for students)
  ENNS: 660 SEK (460 SEK for students)
  JNNS: Y 15,000 (Y 13,000 for students); both rates include the Y 2,000
        entrance fee
----------------------------------------------------------------------------
Membership without Neural Networks:
  INNS: $30
  ENNS: 200 SEK
  JNNS: not available to non-students (subscribe through another society);
        Y 5,000 for students, including the Y 2,000 entrance fee
----------------------------------------------------------------------------
Institutional rates:
  INNS: $1132
  ENNS: 2230 NLG
  JNNS: Y 149,524
----------------------------------------------------------------------------
Name: _____________________________________
Title: _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone: _____________________________________
Fax: _____________________________________
Email: _____________________________________
Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________
INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at
talley.com
http://www.inns.org
ENNS Membership
University of Skovde
P.O. Box 408
531 28 Skovde Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.his.se/ida/enns
JNNS Membership
c/o Professor Tsukada
Faculty of Engineering
Tamagawa University
6-1-1, Tamagawa Gakuen, Machida-city
Tokyo 113-8656 Japan
81 42 739 8431 (phone)
81 42 739 8858 (fax)
jnns at jnns.inf.eng.tamagawa.ac.jp
http://jnns.inf.eng.tamagawa.ac.jp/home-j.html
***************************************************************** end. ========================================================= Mieko Namba Secretary to Dr. Mitsuo Kawato Editorial Administrator of NEURAL NETWORKS ATR Human Information Processing Research Laboratories 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-1008 E-MAIL mieko at hip.atr.co.jp ========================================================= From harnad at coglit.ecs.soton.ac.uk Thu Sep 23 15:23:03 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Thu, 23 Sep 1999 20:23:03 +0100 (BST) Subject: PSYC Call for Commentators: HYPERSTRUCTURE/BRAIN/COGNITION Message-ID: Richardson: HYPERSTRUCTURE IN BRAIN AND COGNITION http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.031 The target whose abstract appears below has just appeared in PSYCOLOQUY, a refereed journal of Open Peer Commentary sponsored by the American Psychological Association. Qualified professional biobehavioral, neural or cognitive scientists are hereby invited to submit Open Peer Commentary on it. Please email or see websites for Instructions if you are not familiar with format or acceptance criteria for PSYCOLOQUY commentaries (all submissions are refereed).
To link to the full text of this article: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.031 To submit articles and commentaries or to seek information: EMAIL: psyc at pucc.princeton.edu URL: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc ----------------------------------------------------------------------- psycoloquy.99.10.031.hyperstructure.richardson Thu Sep 23 1999 ISSN 1055-0143 (71 pars, 60 refs, 6 figs, 1 table, 1389 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 1999 Ken Richardson HYPERSTRUCTURE IN BRAIN AND COGNITION Target Article on Hyperstructure Ken Richardson Centre for Human Development & Learning The Open University Walton Hall Milton Keynes MK7 6AA United Kingdom k.richardson at open.ac.uk ABSTRACT: This target article tries to identify the informational content of experience underlying object percepts and concepts in complex, changeable environments, in a way which can be related to higher cerebral functions. In complex environments, repetitive experience of feature- and object-images in static, canonical form is rare, and this remains a problem in current theories of conceptual representation. The only reliable information available in natural experience consists of nested covariations or 'hyperstructures'. These need to be registered in a representational system. Such representational hyperstructures can have novel emergent structures and evolution into 'higher' forms of representation, such as object concepts and event- and social-schemas. Together, these can provide high levels of predictability. A sketch of a model of hyperstructural functions in object perception and conception is presented. Some comparisons with related views in the literature of the recent decades are made, and some empirical evidence is briefly reviewed. 
KEYWORDS: complexity, covariation, features, hypernetwork, hyperstructure, object concepts, receptive field, representation http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.031 From shirish at csa.iisc.ernet.in Mon Sep 27 04:05:45 1999 From: shirish at csa.iisc.ernet.in (Shirish K. Shevade) Date: Mon, 27 Sep 1999 13:35:45 +0530 (IST) Subject: TR Announcement Message-ID: Technical Report Announcement: Smola and Sch\"{o}lkopf's SMO algorithm for SVM regression is very simple and easy to implement. In a recent paper we suggested some improvements to Platt's SMO algorithm for SVM classifier design. In this report we extend those ideas to Smola and Sch\"{o}lkopf's SMO algorithm for regression. The resulting modified algorithms run much faster than the original SMO. Details are given in the Technical Report mentioned below. A gzipped postscript file containing the report can be downloaded from: http://guppy.mpe.nus.edu.sg/~mpessk/ Send any comments to: shirish at csa.iisc.ernet.in ---------------------------------------------------------------------------- Improvements to SMO Algorithm for SVM Regression Technical Report CD-99-16 S.K. Shevade, S.S. Keerthi, C. Bhattacharyya & K.R.K. Murthy Abstract This paper points out an important source of confusion and inefficiency in Smola and Sch\"{o}lkopf's Sequential Minimal Optimization (SMO) algorithm for regression that is caused by the use of a single threshold value. Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO. These modified algorithms perform significantly faster than the original SMO on the datasets tried. ---------------------------------------------------------------------------- From simone at eealab.unian.it Mon Sep 27 02:18:11 1999 From: simone at eealab.unian.it (Simone G.O. 
Fiori) Date: Mon, 27 Sep 1999 08:18:11 +0200 Subject: Papers available on SOM and BSS-ICA Message-ID: <1.5.4.32.19990927061811.0067c17c@prometeo.eealab.unian.it> Dear Connectionists, the following two papers are now available: "A Review of Artificial Neural Networks Applications in Microwave Computer-Aided Design" by Pietro Burrascano, Simone Fiori, and Mauro Mongiardo University of Perugia, Perugia - Italy Abstract Neural networks have found significant applications in microwave CAD. In this paper, after providing a brief description of the neural networks employed so far in this context, we illustrate some of their most significant applications and typical issues arising in practical implementation. We also summarize current research tendencies and introduce the use of self-organizing maps (SOMs) to enhance model accuracy and applicability. We conclude by considering some future developments and exciting perspectives opened by the use of neural networks in microwave CAD. Keywords Artificial neural networks; Self-organizing maps; Microwave components; Filter design. Journal International Journal of RF and Microwave CAE, Vol. 9, pp. 158 -- 174, 1999 ============================================================== "Entropy Optimization by the PFANN Network: Application to Blind Source Separation" by Simone Fiori University of Perugia, Perugia - Italy Abstract The aim of this paper is to present a study of polynomial functional-link neural units that learn through an information-theoretic criterion. First, the structure of the neuron is presented and the unsupervised learning theory is explained and discussed, with particular attention being paid to its probability density function and cumulative distribution function approximation capability. Then a neural network formed by such neurons (the polynomial functional-link artificial neural network, or PFANN) is shown to be able to separate out linearly mixed heterokurtic source signals, i.e.
signals endowed with either positive or negative kurtoses. In order to compare the performance of the proposed blind separation technique with that of existing methods, the mixture of densities (MOD) approach of Xu et al., which is closely related to PFANN, is briefly recalled; then comparative numerical simulations performed on both synthetic and real-world signals, together with a complexity evaluation, are presented. These results show that the PFANN approach gives similar performance with a noticeable reduction in computational effort. Journal Network: Computation in Neural Systems, Vol. 10, No. 2, pp. 171 -- 186, 1999 Requests for reprints should be addressed to: Dr. Simone Fiori Neural Networks Research Group at the Dept. of Industrial Engineering University of Perugia - Perugia, Italy Loc. Pentima bassa, 21 I-05100, TERNI E-mail: simone at eealab.unian.it, sfr at unipg.it Best regards, Simone From mike at deathstar.psych.ualberta.ca Mon Sep 27 15:28:38 1999 From: mike at deathstar.psych.ualberta.ca (Michael R.W. Dawson) Date: Mon, 27 Sep 1999 13:28:38 -0600 (MDT) Subject: Jobs at U.ofA. Message-ID: DEPARTMENT OF PSYCHOLOGY, UNIVERSITY OF ALBERTA Two positions in Computational Psychology / Computational Neuroscience or in Cognitive Engineering The Department of Psychology at the University of Alberta is seeking to expand its program in Computational Psychology and Cognitive Engineering. Two tenure-track positions at the Assistant Professor level in Computational Psychology / Computational Neuroscience or in Cognitive Engineering will be open to competition. Appointments will be effective July 1, 2000. Candidates in Computational Psychology / Computational Neuroscience should have a strong interest in modeling and predicting human behavior, or in modeling of brain functions at the level of neurons, neuronal groups or large brain subsystems using formal approaches such as mathematical modeling, neural networks, evolutionary computing, or computer simulations.
Candidates in Cognitive Engineering should have a strong interest in interaction of humans with computers, machines or complex environments, in decision-support systems in industrial, medical or emergency situations, and in the design of computer-based tools to support and enhance performance of humans in these situations. The expectation is that the successful candidates will secure competitive research funds and/or industrial support. Hiring decisions will be made on the basis of demonstrated research capability, teaching ability, potential for interactions with colleagues and fit with departmental needs. The applicant should send a curriculum vitae, a statement of current and future research plans, recent publications, and arrange to have at least three letters of reference forwarded, to: Dr Terry Caelli, Chair, Department of Psychology P220 Biological Sciences Building University of Alberta Edmonton, Alberta Canada T6G 2E9. Closing date for applications is December 1, 1999. Further information on these positions can be obtained from http://web.psych.ualberta.ca/hiring. In accordance with Canadian Immigration requirements, this advertisement is directed to Canadian Citizens and permanent residents. If suitable Canadian citizens and permanent residents cannot be found, other individuals will be considered. The University of Alberta is committed to the principle of equity in employment. As an employer we welcome diversity in the workplace and encourage applications from all qualified women and men, including Aboriginal peoples, persons with disabilities, and members of visible minorities. _____________________________________________________ Background Computational Psychology / Computational Neuroscience Computational Psychology is concerned with the generation of formal representations and algorithms for modeling, predicting and improving human behavior. 
Computational Neuroscience, on the other hand, is concerned with modeling brain functions at different levels: at the level of single neurons, at the level of neuronal groups, and at the level of brain subsystems. Both Computational Psychology and Computational Neuroscience rely on a wealth of formal approaches: mathematical modeling, neural networks, evolutionary computing, computer simulations, uncertainty calculi, HMMs, etc. Hiring in this area not only strengthens the expertise in our program, but also helps to increase collaborative ties between programs (in particular with BCN) and between departments (in particular with Neuroscience and with Computing Science). Cognitive Engineering is concerned with the interaction of humans with complex environments, such as the interaction of humans with computers, machines or complex (typically industrial) environments, with decision-support systems in industrial, medical or emergency situations, with the design of computer-based tools to support and enhance the performance of humans in these situations, and with methods to efficiently train humans for these situations. Cognitive Engineering relies on a variety of methods and tools, including performance assessment, spatial information systems and methods for developing computer support technologies (e.g. expert systems, uncertainty, machine learning). Cognitive Engineering has close links to Human Factors and industrial applications. -- Professor Michael R.W. Dawson | mike at bcp.psych.ualberta.ca | (780)-492-5175 Biological Computation Project, Dept. of Psychology, University of Alberta Edmonton, AB, CANADA T6G 2E9 | http://www.bcp.psych.ualberta.ca/~mike/ From sutton at research.att.com Tue Sep 28 14:18:49 1999 From: sutton at research.att.com (Rich Sutton) Date: Tue, 28 Sep 1999 14:18:49 -0400 Subject: two papers on reinforcement learning Message-ID: This is to announce the availability of two papers on reinforcement learning.
-------------------------------------------------------------------------------- Policy Gradient Methods for Reinforcement Learning with Function Approximation Richard S. Sutton, David McAllester, Satinder Singh, and Yishay Mansour Accepted for presentation at NIPS'99 Function approximation is essential to reinforcement learning, but the standard approach of approximating a value function and determining a policy from it has so far proven theoretically intractable. In this paper we explore an alternative approach in which the policy is explicitly represented by its own function approximator, independent of the value function, and is updated according to the gradient of expected reward with respect to the policy parameters. Williams's REINFORCE method and actor--critic methods are examples of this approach. Our main new result is to show that the gradient can be written in a form suitable for estimation from experience aided by an approximate action-value or advantage function. Using this result, we prove for the first time that a version of policy iteration with arbitrary differentiable function approximation is convergent to a locally optimal policy. ftp://ftp.cs.umass.edu/pub/anw/pub/sutton/SMSM-NIPS99-submitted.ps.gz or ftp://ftp.cs.umass.edu/pub/anw/pub/sutton/SMSM-NIPS99-submitted.pdf -------------------------------------------------------------------------------- -------------------------------------------------------------------------------- Between MDPs and Semi-MDPs: A Framework for Temporal Abstraction in Reinforcement Learning Richard S. Sutton, Doina Precup, and Satinder Singh Accepted for publication in Artificial Intelligence (a revised version of our earlier technical report on this topic) Learning, planning, and representing knowledge at multiple levels of temporal abstraction are key, longstanding challenges for AI. 
In this paper we consider how these challenges can be addressed within the mathematical framework of reinforcement learning and Markov decision processes (MDPs). We extend the usual notion of action in this framework to include {\it options\/}---closed-loop policies for taking action over a period of time. Examples of options include picking up an object, going to lunch, and traveling to a distant city, as well as primitive actions such as muscle twitches and joint torques. Overall, we show that options enable temporally abstract knowledge and action to be included in the reinforcement learning framework in a natural and general way. In particular, we show that options may be used interchangeably with primitive actions in planning methods such as dynamic programming and in learning methods such as Q-learning. Formally, a set of options defined over an MDP constitutes a semi-Markov decision process (SMDP), and the theory of SMDPs provides the foundation for the theory of options. However, the most interesting issues concern the interplay between the underlying MDP and the SMDP and are thus beyond SMDP theory. We present results for three such cases: 1) we show that the results of planning with options can be used during execution to interrupt options and thereby perform even better than planned, 2) we introduce new {\it intra-option\/} methods that are able to learn about an option from fragments of its execution, and 3) we propose a notion of subgoal that can be used to improve the options themselves. All of these results have precursors in the existing literature; the contribution of this paper is to establish them in a simpler and more general setting with fewer changes to the existing reinforcement learning framework. In particular, we show that these results can be obtained without committing to (or ruling out) any particular approach to state abstraction, hierarchy, function approximation, or the macro-utility problem. 
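[Moderator's illustration] The SMDP backup that a set of options induces — updating Q(s,o) toward the option's discounted return plus gamma^k times the value of the state where the option terminates after k steps — can be sketched on a toy problem. The corridor task, the particular options, and all constants below are invented for illustration and are not from the paper:

```python
import random
from collections import defaultdict

# Toy corridor: states 0..4 on a line, reward 1 on reaching terminal state 4.
GOAL, N = 4, 5

def step(s, a):
    """Primitive transition: a in {-1, +1}; reward 1 at the goal."""
    s2 = max(0, min(N - 1, s + a))
    return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

def run_option(s, option, gamma):
    """Run an option to termination; return (s', discounted return, gamma^k, done)."""
    ret, disc = 0.0, 1.0
    while True:
        a, terminate = option(s)
        s, r, done = step(s, a)
        ret += disc * r
        disc *= gamma
        if done or terminate(s):
            return s, ret, disc, done

# Options: two one-step primitives plus a temporally extended "sprint right".
left = lambda s: (-1, lambda s2: True)
right = lambda s: (+1, lambda s2: True)
sprint = lambda s: (+1, lambda s2: s2 == GOAL)   # terminates only at the goal

OPTIONS = [left, right, sprint]

def smdp_q_learning(episodes=500, alpha=0.2, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    Q = defaultdict(float)
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Epsilon-greedy choice among options (primitive or extended).
            o = rng.randrange(len(OPTIONS)) if rng.random() < eps else \
                max(range(len(OPTIONS)), key=lambda i: Q[(s, i)])
            s2, ret, disc, done = run_option(s, OPTIONS[o], gamma)
            # SMDP backup: Q(s,o) <- Q(s,o) + alpha*(R + gamma^k max_o' Q(s',o') - Q(s,o))
            target = ret + (0.0 if done else
                            disc * max(Q[(s2, i)] for i in range(len(OPTIONS))))
            Q[(s, o)] += alpha * (target - Q[(s, o)])
            s = s2
    return Q

Q = smdp_q_learning()
```

Primitive actions are just one-step options here, so they mix freely with the extended option in the same value table, which is the interchangeability the abstract describes.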
ftp://ftp.cs.umass.edu/pub/anw/pub/sutton/SPS-aij.ps.gz -------------------------------------------------------------------------------- From robbie at bcs.rochester.edu Tue Sep 28 09:01:50 1999 From: robbie at bcs.rochester.edu (Robbie Jacobs) Date: Tue, 28 Sep 1999 09:01:50 -0400 (EDT) Subject: visual cue combination articles Message-ID: <199909281301.JAA08097@broca.bcs.rochester.edu> The following two articles are published in the journal Vision Research, but may be of interest to readers of this list: (1) Jacobs, R.A. (1999) Optimal integration of texture and motion cues to depth. Vision Research, 39, 3621-3629. (2) Jacobs, R.A. and Fine, I. (1999) Experience-dependent integration of texture and motion cues to depth. Vision Research, 39, 4062-4075. ========================================= (1) Jacobs, R.A. (1999) Optimal integration of texture and motion cues to depth. Vision Research, 39, 3621-3629. We report the results of a depth-matching experiment in which subjects were asked to adjust the height of an ellipse until it matched the depth of a simulated cylinder defined by texture and motion cues. On one-third of the trials the shape of the cylinder was primarily given by motion information, on one-third of the trials it was given by texture information, and on the remaining trials it was given by both sources of information. Two optimal cue combination models are described where optimality is defined in terms of Bayesian statistics. The parameter values of the models are set based on subjects' responses on trials when either the motion cue or the texture cue was informative. These models provide predictions of subjects' responses on trials when both cues were informative. The results indicate that one of the optimal models provides a good fit to the subjects' data, and the second model provides an exceptional fit. 
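[Moderator's illustration] In the standard independent-Gaussian setting, the Bayesian-optimal combination underlying such models reduces to inverse-variance weighting of the cues. A generic sketch follows; the linear-Gaussian assumption and the example numbers are illustrative and are not the fitted models of the paper:

```python
def combine_cues(estimates, variances):
    """Maximum-likelihood fusion of independent Gaussian cue estimates.

    Each cue is weighted by its reliability (inverse variance); the fused
    estimate has lower variance than either cue alone.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_var = 1.0 / total
    return fused, fused_var

# Example: a texture cue (depth 10.0, variance 4.0) and a more reliable
# motion cue (depth 12.0, variance 1.0); the fused depth sits nearer 12.
depth, var = combine_cues([10.0, 12.0], [4.0, 1.0])
# depth = (10/4 + 12/1) / (1/4 + 1/1) = 14.5 / 1.25 = 11.6, var = 0.8
```

Fitting the cue variances from the single-cue trials and predicting the two-cue trials, as the abstract describes, amounts to estimating `variances` per cue and then applying this rule.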
Because the predictions of the optimal models closely match the experimental data, we conclude that observers' cue combination strategies are indeed optimal, at least under the conditions studied here. Available on the web at: www.bcs.rochester.edu/bcs/people/faculty/robbie/jacobs.vr99.ps.Z ========================================= (2) Jacobs, R.A. and Fine, I. (1999) Experience-dependent integration of texture and motion cues to depth. Vision Research, 39, 4062-4075. Previous investigators have shown that observers' visual cue combination strategies are remarkably flexible in the sense that these strategies adapt on the basis of the estimated reliabilities of the visual cues. However, these researchers have not addressed how observers acquire these estimated reliabilities. This article studies observers' abilities to learn cue combination strategies. Subjects made depth judgments about simulated cylinders whose shapes were indicated by motion and texture cues. Because the two cues could indicate different shapes, it was possible to design tasks in which one cue provided useful information for making depth judgments, whereas the other cue was irrelevant. The results of Experiment One suggest that observers' cue combination strategies are adaptable as a function of training; subjects adjusted their cue combination rules to use a cue more heavily when the cue was informative on a task versus when the cue was irrelevant. Experiment Two demonstrated that experience-dependent adaptation of cue combination rules is context-sensitive. On trials with presentations of short cylinders, one cue was informative, whereas on trials with presentations of tall cylinders, the other cue was informative. The results suggest that observers can learn multiple cue combination rules, and can learn to apply each rule in the appropriate context. Experiment Three demonstrated a possible limitation on the context-sensitivity of adaptation of cue combination rules. 
One cue was informative on trials with presentations of cylinders at a left oblique orientation, whereas the other cue was informative on trials with presentations of cylinders at a right oblique orientation. The results indicate that observers did not learn to use different cue combination rules in different contexts under these circumstances. These results are consistent with the hypothesis that observers' visual systems are biased to learn to perceive in the same way views of bilaterally symmetric objects that differ solely by a symmetry transformation. Taken in conjunction with the results of Experiment Two, this means that the visual learning mechanism underlying cue combination adaptation is biased such that some sets of statistics are more easily learned than others. Available on the web at: www.bcs.rochester.edu/bcs/people/faculty/robbie/jacobsfine.vr99.ps.Z From RK at hirn.uni-duesseldorf.de Fri Sep 24 11:56:07 1999 From: RK at hirn.uni-duesseldorf.de (Rolf Kotter) Date: Fri, 24 Sep 1999 17:56:07 +0200 Subject: PTRS - call for submissions Message-ID: <37EB9F17.5C02383@hirn.uni-duesseldorf.de> ============================================================================ CALL FOR SUBMISSIONS Theme issue of Philosophical Transactions: Biological Sciences http://www.pubs.royalsoc.ac.uk/publish/phi_bs/ Theme: NEUROSCIENCE DATABASES - tools for exploring structure-function relationships in the brain Theme editor: Rolf Kötter Understanding the workings of systems as complex as the nervous system requires the aid of computational tools to collate, analyse and test experimental data with the aim of establishing close structure-function relationships. Over the last few years, many attempts have been made to construct neuroscience databases collating data about structures, small functional circuits and global functions of the nervous system.
The aim of this Theme Issue is to critically review the achievements and problems of previous and current approaches, to devise future strategies and to make these insights available to everyone concerned with neuroscience databases. More specifically, papers are expected to cover one or more of the following topics:
- adequate representations of different types of neuroscientific data; identification of promising research fields versus problem data; arguments for representation of individual vs. summary, and raw vs. interpreted data
- tools for meta-analysis of data in neuroscience databases and methods (statistical, computational, etc.) for establishing structure-function relationships in the brain
- quality control of database contents: access control, peer review, links to publications
- strategies to improve the contents, user-friendliness, acceptance, significance and longevity of databases; desirable developments in other fields, e.g. data visualisation
- lessons to be learnt from databases in fields beyond neuroscience (e.g. gene sequences, protein structure, images, references; see Human Genome Project or U.S. National Center for Biotechnology Information)
- technical and organisational issues of design, programming (reusable code?), maintenance, updating and cross-linking of databases and hardware platforms; support teams, financial requirements, life-cycles of databases
- impact of databases on the neuroscience communities; relationship between experimentalists (data producers), data collators and data analysts: who wants and who needs databases, and how do databases affect the production and publication of data?
These topics are often best addressed within the context of specific database projects, but note that it is not sufficient to simply present your database project. Finally, although the contents of the contributed papers can be quite specialised, their concept, thrust and significance should be intelligible to interested non-specialists.
SCHEDULE

All submissions will be subject to a rigorous review process. The Theme Issue will contain 10-15 refereed papers, which will be invited on the basis of abstract submissions.

1 November 1999: Submission of an abstract and a tentative title declaring the intention to submit a full paper by the deadline given below. This should be done by e-mail to RK at hirn.uni-duesseldorf.de

After the selection process and invitation of full papers:
31 May 2000: Deadline for receipt of the full paper in three copies.

After referees' comments and (if necessary) revisions:
31 December 2000: Finalisation of all papers for publication of the theme issue in 2001.

ADDRESS FOR SUBMISSIONS AND CORRESPONDENCE

Dr. Rolf Kötter
C. & O. Vogt Brain Research Institute, Bldg. 22.03
Heinrich Heine University, Universitätsstr. 1
D-40225 Düsseldorf, Germany
phone + fax: +49-211-81-12095
e-mail: RK at hirn.uni-duesseldorf.de
http://www.hirn.uni-duesseldorf.de/~rk

============================================================================ From abla at gatsby.ucl.ac.uk Mon Sep 27 07:10:36 1999 From: abla at gatsby.ucl.ac.uk (abla@gatsby.ucl.ac.uk) Date: Mon, 27 Sep 1999 12:10:36 +0100 Subject: New Data Fusion MSc Message-ID: <3.0.6.32.19990927121036.00942d70@axon.gatsby.ucl.ac.uk> To all participants of the Gatsby Neural Computation Tutorial: Given your interest in neural networks, you (or your colleagues) might be interested to know about the new MSc in data fusion (a technology which makes use of neural networks). This was announced at the recent FUSION'99 conference held in Sunnyvale, USA. It is the first such course in the world and generated great interest in the international data fusion community. This much-needed course is being run by the School of Computing at the University of Central England in Birmingham. The course lasts for one year, starting in January 2000, and is designed to allow people in employment to continue to work with minimal disruption.
The first part of the course comprises taught modules delivered either during an intensive week or on a one-day-a-week basis over the semester. The course is designed in such a way that anybody may attend the one-week "Introduction to Data Fusion" module without taking the full MSc. The second part of the course involves carrying out an appropriate research project culminating in a dissertation. It is hoped that many students will identify a practical data fusion problem within their own company on which they can work to the benefit of both themselves and their company. Further details can be found on the internet at http://www.cis.uce.ac.uk/faculty/comput/courses/msc_DFroute.htm or by contacting John Perkins, MSc Course Director, by email at john.perkins at uce.ac.uk or by phone on 0121 331 6209. If you would like informal information, please email me at jane.obrien at datafusion.clara.co.uk. Jane O'Brien Visiting Fellow of the Faculty of Computing, Information and Computing University of Central England From espaa at exeter.ac.uk Mon Sep 27 10:51:47 1999 From: espaa at exeter.ac.uk (ESPAA) Date: Mon, 27 Sep 1999 15:51:47 +0100 (GMT Daylight Time) Subject: PAA issue 2(3) contents Message-ID: Pattern Analysis & Applications Springer Verlag Ltd. http://www.dcs.exeter.ac.uk/paa (JOURNAL WEBSITE) http://link.springer.de/link/service/journals/10044/index.htm (SPRINGER ELECTRONIC SERVICE) ISSN: 1433-7541 (printed version) ISSN: 1433-755X (electronic version) Table of Contents Vol. 2 Issue 3 L. P. Cordella, P. Foggia, C. Sansone, F. Tortorella, M. Vento: Reliability Parameters to Improve Combination Strategies in Multi-Expert Systems Pattern Analysis & Applications 2 (1999) 3, 205-214 P. Foggia, C. Sansone, F. Tortorella, M. Vento: Definition and Validation of a Distance Measure Between Structural Primitives Pattern Analysis & Applications 2 (1999) 3, 215-227 Z. Lou, K. Liu, J. Y. Yang, C. Y.
Suen: Rejection Criteria and Pairwise Discrimination of Handwritten Numerals Based on Structural Features Pattern Analysis & Applications 2 (1999) 3, 228-238 J. Y. Goulermas, P. Liatsis: Incorporating Gradient Estimations in a Circle-Finding Probabilistic Hough Transform Pattern Analysis & Applications 2 (1999) 3, 239-250 J. G. Keller, S. K. Rogers, M. Kabrisky, M. E. Oxley: Object Recognition Based on Human Saccadic Behaviour Pattern Analysis & Applications 2 (1999) 3, 251-263 __________________________________ Oliver Jenkin Editorial Secretary Pattern Analysis and Applications Department of Computer Science University of Exeter Exeter EX4 4PT tel: +44-1392-264066 fax: +44-1392-264067 email: espaa at exeter.ac.uk ____________________________ From cindy at cns.bu.edu Tue Sep 28 10:51:47 1999 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Tue, 28 Sep 1999 10:51:47 -0400 Subject: call for papers: ICCNS 2000 Message-ID: <199909281451.KAA16678@retina.bu.edu> ***** CALL FOR PAPERS ***** FOURTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS Tutorials: May 24, 2000 Meeting: May 25-27, 2000 Boston University 677 Beacon Street Boston, Massachusetts 02215 http://cns-web.bu.edu/meetings/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems This interdisciplinary conference has drawn about 300 people from around the world each time that it has been offered. Last year's conference was attended by scientists from 30 countries. The conference is structured to facilitate intense communication between its participants, both in the formal sessions and during its other activities. As during previous years, the millennium conference will focus on solutions to the fundamental questions: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? 
The conference will include invited tutorials and lectures, and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. Abstract submission encourages presentation of the latest results. Costs are kept at a minimum without compromising the quality of meeting handouts and social events.

CALL FOR ABSTRACTS

Session Topics:
* vision
* spatial mapping and navigation
* object recognition
* neural circuit models
* image understanding
* neural system models
* audition
* mathematics of neural systems
* speech and language
* robotics
* unsupervised learning
* hybrid systems (fuzzy, evolutionary, digital)
* supervised learning
* neuromorphic VLSI
* reinforcement and emotion
* industrial applications
* sensory-motor control
* cognition, planning, and attention
* other

Contributed abstracts must be received, in English, by January 28, 2000. Notification of acceptance will be provided by email by February 29, 2000. A meeting registration fee of $50 for regular attendees and $35 for students must accompany each Abstract. See Registration Information for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted abstracts will be returned on request only until April 14, 2000. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted.
Abstract title, author name(s), affiliation(s), and mailing and email address(es) should begin each Abstract. An accompanying cover letter should include: full title of the Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; and a first and second choice from among the topics above, including whether it is biological (B) or technological (T) work. Example: first choice: vision (T); second choice: neural system models (B). (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. Accepted Abstracts will be printed in the conference proceedings volume. No full-length paper will be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215.

REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings.

STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows may be available to cover meeting travel and living costs. This will be confirmed one way or the other, and broadly advertised if confirmed, before the deadline to apply for fellowship support, which will be January 28, 2000. Applicants will be notified by email by February 29, 2000.
Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Fellowship checks will be distributed after the meeting.

REGISTRATION FORM

Fourth International Conference on Cognitive and Neural Systems
Department of Cognitive and Neural Systems
Boston University
677 Beacon Street
Boston, Massachusetts 02215
Tutorials: May 24, 2000
Meeting: May 25-27, 2000
FAX: (617) 353-7755
http://cns-web.bu.edu/meetings/

(Please Type or Print)

Mr/Ms/Dr/Prof: _____________________________________________________
Name: ______________________________________________________________
Affiliation: _______________________________________________________
Address: ___________________________________________________________
City, State, Postal Code: __________________________________________
Phone and Fax: _____________________________________________________
Email: _____________________________________________________________

The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks.

CHECK ONE:
( ) $75 Conference plus Tutorial (Regular)
( ) $50 Conference plus Tutorial (Student)
( ) $50 Conference Only (Regular)
( ) $35 Conference Only (Student)
( ) $25 Tutorial Only (Regular)
( ) $15 Tutorial Only (Student)

METHOD OF PAYMENT (please fax or mail):
[ ] Enclosed is a check made payable to "Boston University".
Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges.
[ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only).

Name as it appears on the card: _____________________________________
Type of card: _______________________________________________________
Account number: _____________________________________________________
Expiration date: ____________________________________________________
Signature: __________________________________________________________

From S.Singh at exeter.ac.uk Wed Sep 29 06:14:48 1999 From: S.Singh at exeter.ac.uk (Sameer Singh) Date: Wed, 29 Sep 1999 11:14:48 +0100 (GMT Daylight Time) Subject: MPhil in CS (Financial Forecasting using Neural Networks) Message-ID: MPhil in Computer Science UNIVERSITY OF EXETER SCHOOL OF ENGINEERING AND COMPUTER SCIENCE Department of Computer Science Applications are now invited for an MPhil studentship in the area of "Financial Forecasting using Neural Networks". The project will develop student skills in areas including neural networks, financial forecasting, and pattern recognition. The studentship is in collaboration with Siebe Appliance Controls Ltd at Plymouth. Candidates for this studentship should have a degree in computer science, engineering or a related subject. They should have programming skills in C/C++/Java and knowledge of the Unix operating system. The studentship covers UK/EEC fees and maintenance over two years. The successful candidate should expect to take up the studentship no later than 1 November 1999. Applicants should send a CV, including the names and addresses of two referees, to Dr Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK (s.singh at exeter.ac.uk). Applicants should ask their referees to send their references directly to the above address. Informal enquiries can be made at +44-1392-264053.
--------------------------------------------
Sameer Singh
Director, PANN Research
Department of Computer Science
University of Exeter
Exeter EX4 4PT
UK
tel: +44-1392-264053
fax: +44-1392-264067
email: s.singh at exeter.ac.uk
web: http://www.dcs.exeter.ac.uk/academics/sameer
--------------------------------------------

From nat at cs.dal.ca Wed Sep 29 11:28:51 1999 From: nat at cs.dal.ca (Nathalie Japkowicz) Date: Wed, 29 Sep 1999 12:28:51 -0300 (ADT) Subject: Thesis + Papers Announcement Message-ID: Dear Connectionists, I am pleased to announce the availability of my Ph.D. Dissertation and of a few related papers. Regards, Nathalie.

-----------------------------------------------------------------------
Thesis:
-------
Title: "Concept-Learning in the Absence of Counter-Examples: An Autoassociation-Based Approach to Classification"
Advisors: Stephen Jose Hanson and Casimir A. Kulikowski
URL: http://borg.cs.dal.ca/~nat/Research/thesis.ps.gz

Abstract:
--------
The overwhelming majority of research currently pursued within the framework of concept-learning concentrates on discrimination-based learning. Nevertheless, this emphasis can present a practical problem: there are real-world engineering problems for which counter-examples are both scarce and difficult to gather. For these problems, recognition-based learning systems are much more appropriate because they do not use counter-examples in the concept-learning phase and thus require fewer counter-examples altogether. The purpose of this dissertation is to analyze a promising connectionist recognition-based learning system---autoassociation-based classification---and answer the following questions raised by a preliminary comparison of the autoassociator and its discrimination counterpart, the Multi-Layer Perceptron (MLP), on three real-world domains:

* What features of the autoassociator make it capable of performing classification in the absence of counter-examples?
* What causes the autoassociator to be significantly more efficient than MLP in certain domains?
* What domain characteristics cause the autoassociator to be more accurate than MLP, and MLP to be more accurate than the autoassociator?

A study of the two systems in the context of these questions yields the following conclusions:
1) Autoassociation-based classification is possible in a particular class of practical domains called non-linear and multi-modal, because the autoassociator uses a multi-modal specialization bias to compensate for the absence of counter-examples. This bias can be controlled by varying the capacity of the autoassociator.
2) The difference in efficiency between the autoassociator and MLP observed on this class of domains is caused by the fact that the autoassociator uses a (fast) bottom-up generalization strategy whereas MLP has recourse to a (slow) top-down one, even though both systems are trained by the backpropagation procedure.
3) The autoassociator classifies domains requiring particularly strong specialization biases caused by the counter-conceptual class, or particularly weak specialization biases caused by the conceptual class, more accurately than MLP. However, MLP is more accurate than the autoassociator on domains requiring particularly strong specialization biases caused by the conceptual class.

The results of this study thus suggest that recognition-based learning, which is often dismissed in favor of discrimination-based approaches in the context of concept-learning, may present an interesting array of classification strengths.

------------------------------------------------------------------------
Related Papers:
---------------

* "Nonlinear Autoassociation is not Equivalent to PCA", Japkowicz, N., Hanson, S.J., and Gluck, M.A., in Neural Computation (in press).
Abstract:
---------
A common misperception within the Neural Network community is that, even with nonlinearities in their hidden layer, autoassociators trained with Backpropagation are equivalent to linear methods such as Principal Component Analysis (PCA). The purpose of this paper is to demonstrate that nonlinear autoassociators actually behave differently from linear methods and that they can outperform these methods when used for latent extraction, projection and classification. While linear autoassociators emulate PCA and thus exhibit a flat or unimodal reconstruction error surface, autoassociators with nonlinearities in their hidden layer learn domains by building reconstruction error surfaces that, depending on the task, contain multiple local valleys. This particular interpolation bias allows nonlinear autoassociators to represent appropriate classifications of nonlinear multi-modal domains, in contrast to linear autoassociators, which are inappropriate for such tasks. In fact, autoassociators with hidden unit nonlinearities can be shown to perform nonlinear classification and nonlinear recognition. URL: http://borg.cs.dal.ca/~nat/Papers/neuralcomp.ps.gz

* "Adaptability of the Backpropagation Procedure", Japkowicz, N. and Hanson, S.J., in the proceedings of the 1999 International Joint Conference on Neural Networks (IJCNN-99).

Abstract:
---------
Possible paradigms for concept learning by feedforward neural networks include discrimination and recognition. An interesting aspect of this dichotomy is that the recognition-based implementation can learn certain domains much more efficiently than the discrimination-based one, despite the close structural relationship between the two systems. The purpose of this paper is to explain this difference in efficiency.
We suggest that it is caused by a difference in the generalization strategy adopted by the Backpropagation procedure in the two cases: while the autoassociator uses a (fast) bottom-up strategy, MLP has recourse to a (slow) top-down one, despite the fact that the two systems are both optimized by the Backpropagation procedure. This result is important because it sheds some light on the nature of Backpropagation's adaptive capability. From a practical viewpoint, it suggests a deterministic way to increase the efficiency of Backpropagation-trained feedforward networks. URL: http://borg.cs.dal.ca/~nat/Papers/ijcnn-5.ps.gz

* "Are we Better off without Counter Examples", Japkowicz, N., in the proceedings of the 1999 conference on Advances in Intelligent Data Analysis (AIDA-99).

Abstract:
---------
Concept-learning is commonly implemented using discrimination-based techniques which rely on both examples and counter-examples of the concept. Recently, however, a recognition-based approach that learns a concept in the absence of counter-examples was shown to be more accurate than its discrimination counterpart on two real-world domains and as accurate on the third. The purpose of this paper is to find out whether this recognition-based approach is generally more accurate than its discrimination counterpart or whether the results it obtained previously are purely coincidental. The analysis conducted in this paper concludes that the results obtained on the real-world domains were not coincidental, which suggests that recognition-based approaches are promising techniques worth studying in greater depth. URL: http://borg.cs.dal.ca/~nat/Papers/accuracy.ps.gz

* "A Novelty Detection Approach to Classification", Japkowicz, N., Myers, C. and Gluck, M., in the proceedings of the Fourteenth International Joint Conference on Artificial Intelligence (IJCAI-95), pp. 518-523.
Abstract: --------- Novelty Detection techniques are concept-learning methods that proceed by recognizing positive instances of a concept rather than differentiating between its positive and negative instances. Novelty Detection approaches consequently require very few, if any, negative training instances. This paper presents a particular Novelty Detection approach to classification that uses a Redundancy Compression and Non-Redundancy Differentiation technique based on the Gluck & Myers model of the hippocampus, a part of the brain critically involved in learning and memory. In particular, this approach consists of training an autoencoder to reconstruct positive input instances at the output layer and then using this autoencoder to recognize novel instances. Classification is possible, after training, because positive instances are expected to be reconstructed accurately while negative instances are not. The purpose of this paper is to compare HIPPO, the system that implements this technique, to C4.5 and feedforward neural network classification on several applications. URL: http://borg.cs.dal.ca/~nat/Papers/ijcai95_final.ps.gz -- Nathalie Japkowicz, Ph.D. 
Assistant Professor
Faculty of Computer Science
DalTech/Dalhousie University
6050 University Avenue
Halifax, Nova Scotia
Canada, B3H 1W5
e-mail: nat at cs.dal.ca
Homepage: http://borg.cs.dal.ca/~nat

From schubert at sto.foa.se Thu Sep 30 03:27:15 1999 From: schubert at sto.foa.se (Johan Schubert) Date: Thu, 30 Sep 1999 09:27:15 +0200 Subject: On web: Clustering Belief Functions (Dempster-Shafer Theory) Message-ID: <990930092719.ZM24839@atlas.sto.foa.se>

Clustering Belief Functions (Dempster-Shafer Theory)
----------------------------------------------------

My papers on clustering belief functions, etc., are now available on the web at: http://www.foa.se/fusion/

Publications

Schubert, J., Simultaneous Dempster-Shafer clustering and gradual determination of number of clusters using a neural network structure. In Proceedings of the 1999 Information, Decision and Control Conference (IDC'99), Adelaide, Australia, 8-10 February 1999. IEEE, Piscataway, 1999, pp. 401-406.

Schubert, J., A neural network and iterative optimization hybrid for Dempster-Shafer clustering. In Proceedings of EuroFusion98 International Conference on Data Fusion (EF'98), M. Bedworth, J. O'Brien (Eds.), Great Malvern, UK, 6-7 October 1998, pp. 29-36.

Schubert, J., Fast Dempster-Shafer clustering using a neural network structure. In Proceedings of the Seventh International Conference on Information Processing and Management of Uncertainty in Knowledge-based Systems (IPMU'98), Université de La Sorbonne, Paris, France, 6-10 July 1998. Editions EDK, Paris, 1998, pp. 1438-1445.

Bergsten, U., Schubert, J. and Svensson, P., Applying Data Mining and Machine Learning Techniques to Submarine Intelligence Analysis. In Proceedings of the Third International Conference on Knowledge Discovery and Data Mining (KDD'97), D. Heckerman, H. Mannila, D. Pregibon, R. Uthurusamy (Eds.), Newport Beach, USA, 14-17 August 1997. The AAAI Press, Menlo Park, pp. 127-130.
Schubert, J., Creating Prototypes for Fast Classification in Dempster-Shafer Clustering. In Qualitative and Quantitative Practical Reasoning, D. M. Gabbay, R. Kruse, A. Nonnengart, H. J. Ohlbach (Eds.), Proceedings of the First International Joint Conference on Qualitative and Quantitative Practical Reasoning (ECSQARU-FAPR'97), Bad Honnef, Germany, 9-12 June 1997. Springer-Verlag (LNAI 1244), Berlin, 1997, pp. 525-535.

Schubert, J., Specifying nonspecific evidence. International Journal of Intelligent Systems 11(8) (1996) 525-563.

Schubert, J., On Rho in a Decision-Theoretic Apparatus of Dempster-Shafer Theory. International Journal of Approximate Reasoning 13(3) (1995) 185-200. (FOA-B--95-00097-3.4--SE, Defence Research Establishment, 1995)

Schubert, J., Cluster-based Specification Techniques in Dempster-Shafer Theory for an Evidential Intelligence Analysis of Multiple Target Tracks (Thesis Abstract). AI Communications 8(2) (1995) 107-110.

Schubert, J., Cluster-based Specification Techniques in Dempster-Shafer Theory. In Symbolic and Quantitative Approaches to Reasoning and Uncertainty, C. Froidevaux and J. Kohlas (Eds.), Proceedings of the European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty (ECSQARU'95), Université de Fribourg, Switzerland, 3-5 July 1995. Springer-Verlag (LNAI 946), Berlin, 1995, pp. 395-404.

Schubert, J., Finding a Posterior Domain Probability Distribution by Specifying Nonspecific Evidence. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 3(2) (1995) 163-185.

Schubert, J., Cluster-based Specification Techniques in Dempster-Shafer Theory for an Evidential Intelligence Analysis of Multiple Target Tracks, Ph.D. Thesis, TRITA-NA-9410, ISRN KTH/NA/R--94/10--SE, ISSN 0348-2952, ISBN 91-7170-801-4. Royal Institute of Technology, Sweden, 1994.

Bergsten, U. and Schubert, J., Dempster's Rule for Evidence Ordered in a Complete Directed Acyclic Graph.
International Journal of Approximate Reasoning 9(1) (1993) 37-73.

Schubert, J., On Nonspecific Evidence. International Journal of Intelligent Systems 8(6) (1993) 711-725.

All papers are available as postscript files; most are also available as PDF files [except for my 1994 Ph.D. thesis, which is only available in hard copy by post upon request (no charge): schubert at sto.foa.se].

Sincerely,

Johan Schubert
Department of Data and Information Fusion
Defence Research Establishment, Sweden
E-mail: schubert at sto.foa.se

From bogus@does.not.exist.com Thu Sep 30 09:10:53 1999 From: bogus@does.not.exist.com () Date: Thu, 30 Sep 1999 15:10:53 +0200 Subject: CFP: ESANN'2000 European Symposium on Artificial Neural Networks Message-ID:

----------------------------------------------------
|                                                  |
|                   ESANN'2000                     |
|                                                  |
|              8th European Symposium              |
|          on Artificial Neural Networks           |
|                                                  |
|     Bruges (Belgium) - April 26-27-28, 2000      |
|                                                  |
|     First announcement and call for papers       |
----------------------------------------------------

Technically co-sponsored by the IEEE Neural Networks Council, the IEEE Region 8, the IEEE Benelux Section, and the International Neural Networks Society. The call for papers for the ESANN'2000 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate it if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. You will find below a short version of this call for papers, without the instructions to authors (available on the Web). If you have difficulty connecting to the Web, please send an e-mail to esann at dice.ucl.ac.be and we will send you a full version of the call for papers.
ESANN'2000 is organised in collaboration with the UCL (Université catholique de Louvain, Louvain-la-Neuve) and the KULeuven (Katholieke Universiteit Leuven).

Scope and topics
----------------
Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become the reference event for researchers on fundamental and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN in order to present their latest results and comprehensive surveys, and to discuss future developments in this field. The ESANN'2000 conference will focus on fundamental aspects of ANNs: theory, models, learning algorithms, mathematical aspects, approximation of functions, classification, control, time-series prediction, statistics, signal processing, vision, self-organization, vector quantization, evolutive learning, psychological computations, biological plausibility, etc. Papers on links and comparisons between ANNs and other domains of research (such as statistics, data analysis, signal processing, biology, psychology, evolutive learning, bio-inspired systems, etc.) are also encouraged. Papers will be presented orally (no parallel sessions) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. It is important to mention that it is the topic of a paper, not its quality, which will decide whether it fits better into an oral or a poster session. The selection criteria for posters will be identical to those for oral presentations, and both will be printed in the same way in the proceedings. Nevertheless, authors may indicate on the author submission form that they are only willing to present their paper orally.
The following is a non-exhaustive list of topics covered during the ESANN conferences:

o theory
o models and architectures
o mathematics
o learning algorithms
o vector quantization
o self-organization
o RBF networks
o Bayesian classification
o recurrent networks
o support vector machines
o time series forecasting
o adaptive control
o statistical data analysis
o independent component analysis
o signal processing
o approximation of functions
o cellular neural networks
o fuzzy neural networks
o natural and artificial vision
o hybrid networks
o identification of non-linear dynamic systems
o biologically plausible artificial networks
o bio-inspired systems
o neurobiological systems
o cognitive psychology
o adaptive behaviour
o evolutive learning

Special sessions
----------------
Special sessions will be organized by renowned scientists in their respective fields. Papers submitted to these sessions are reviewed according to the same rules as any other submission. Authors who submit papers to one of these sessions are invited to mention it on the author submission form; nevertheless, submissions to the special sessions must follow the same format, instructions and deadlines as any other submission, and must be sent to the same address.

o Self-organizing maps for data analysis - J. Lampinen, K. Kaski, Helsinki Univ. of Tech. (Finland)
o Time-series prediction - J. Suykens, J. Vandewalle, K.U. Leuven (Belgium)
o Artificial neural networks and robotics - R. Duro, J. Santos Reyes, Univ. da Coruna (Spain)
o Support Vector Machines - C. Campbell, Bristol Univ. (UK), J. Suykens, K.U. Leuven (Belgium)
o Neural networks and statistics - W. Duch, Nicholas Copernicus Univ. (Poland)
o Neural networks in medicine - T. Villmann, Univ. Leipzig (Germany)
o Artificial neural networks for energy management systems - G. Joya, Univ. de Malaga (Spain)

Location
--------
The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe.
Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known world-wide for its architectural style, its canals, and its pleasant atmosphere. The conference will be organised in a hotel located near the centre of the town (walking distance). There is no obligation for participants to stay in this hotel. Hotels of all levels of comfort and price are available in Bruges; it is possible to book a room in the conference hotel, or in another one (50 m from the first) at a preferential rate, through the conference secretariat. A list of other smaller hotels is also available. The conference will be held at the Novotel hotel, Katelijnestraat 65B, 8000 Brugge, Belgium.

Call for contributions
----------------------
Prospective authors are invited to submit
- six original copies of their manuscript (including at least two originals or very good copies without glued material, which will be used for the proceedings)
- one signed copy of the author submission form
before December 10, 1999. Authors are invited to enclose a floppy disk or CD with their contribution in (generic) PostScript or PDF format. Sorry, electronic or fax submissions are not accepted. The working language of the conference (including proceedings) is English. The instructions to authors, together with the author submission form, are available on the ESANN Web server: http://www.dice.ucl.ac.be/esann A printed version of these documents is also available through the conference secretariat (please use email if possible). Authors are invited to follow the instructions to authors. A LaTeX style file is also available on the Web. Authors must indicate their choice for oral or poster presentation on the author submission form. They must also sign a written agreement that they will register for the conference and present the paper in case of acceptance of their submission.
Authors of accepted papers will have to register before February 28, 2000, and will benefit from the advance registration fee.

Submissions must be sent to:
	Michel Verleysen
	UCL - DICE
	3, place du Levant
	B-1348 Louvain-la-Neuve
	Belgium
	esann at dice.ucl.ac.be

All submissions will be acknowledged by fax or email before December 23, 1999.

Deadlines
---------
Submission of papers           December 10, 1999
Notification of acceptance     January 31, 2000
Symposium                      April 26-27-28, 2000

Registration fees
-----------------
                registration before     registration after
                March 17, 2000          March 17, 2000
Universities    BEF 16000               BEF 17000
Industries      BEF 20000               BEF 21000

The registration fee includes attendance to all sessions, the lunches during the three days of the conference, the coffee breaks twice a day, the conference dinner, and the proceedings.

Conference secretariat
----------------------
Michel Verleysen
D facto conference services
27 rue du Laekenveld
B - 1080 Brussels (Belgium)
phone: + 32 2 420 37 57
fax: + 32 2 420 02 55
E-mail: esann at dice.ucl.ac.be
http://www.dice.ucl.ac.be/esann

Steering and local committee
----------------------------
François Blayo          Préfigure (F)
Marie Cottrell          Univ. Paris I (F)
Jeanny Hérault          INPG Grenoble (F)
Henri Leich             Fac. Polytech. Mons (B)
Bernard Manderick       Vrije Univ. Brussel (B)
Eric Noldus             Univ. Gent (B)
Jean-Pierre Peters      FUNDP Namur (B)
Joos Vandewalle         KUL Leuven (B)
Michel Verleysen        UCL Louvain-la-Neuve (B)

Scientific committee (to be confirmed)
--------------------
Edoardo Amaldi          Politecnico di Milano (I)
Agnès Babloyantz        Univ. Libre Bruxelles (B)
Hervé Bourlard          IDIAP Martigny (CH)
Joan Cabestany          Univ. Polit. de Catalunya (E)
Holk Cruse              Universität Bielefeld (D)
Eric de Bodt            Univ. Lille II & UCL Louv.-la-N. (B)
Dante Del Corso         Politecnico di Torino (I)
Wlodek Duch             Nicholas Copernicus Univ. (PL)
Marc Duranton           Philips / LEP (F)
Jean-Claude Fort        Université Nancy I (F)
Bernd Fritzke           Dresden Univ. of Technology (D)
Stan Gielen             Univ. of Nijmegen (NL)
Manuel Grana            UPV San Sebastian (E)
Anne Guérin-Dugué       INPG Grenoble (F)
Martin Hasler           EPFL Lausanne (CH)
Laurent Hérault         CEA-LETI Grenoble (F)
Christian Jutten        INPG Grenoble (F)
Juha Karhunen           Helsinki Univ. of Technology (FIN)
Vera Kurkova            Acad. of Science of the Czech Rep. (CZ)
Petr Lansky             Acad. of Science of the Czech Rep. (CZ)
Mia Loccufier           Univ. Gent (B)
Eddy Mayoraz            Motorola Palo Alto (USA)
Jean Arcady Meyer       Univ. Pierre et Marie Curie - Paris 6 (F)
José Mira               UNED (E)
Jean-Pierre Nadal       Ecole Normale Supérieure Paris (F)
Gilles Pagès            Univ. Pierre et Marie Curie - Paris 6 (F)
Thomas Parisini         Politecnico di Milano (I)
Hélène Paugam-Moisy     Univ. Lumière Lyon 2 (F)
Alberto Prieto          Universidad de Granada (E)
Leonardo Reyneri        Politecnico di Torino (I)
Tamas Roska             Hungarian Academy of Science (H)
Jean-Pierre Rospars     INRA Versailles (F)
John Stonham            Brunel University (UK)
Johan Suykens           KUL Leuven (B)
John Taylor             King's College London (UK)
Claude Touzet           IUSPIM Marseilles (F)
Marc Van Hulle          KUL Leuven (B)
Christian Wellekens     Eurecom Sophia-Antipolis (F)

From mw at stat.Duke.EDU Thu Sep 30 10:40:35 1999
From: mw at stat.Duke.EDU (Mike West)
Date: Thu, 30 Sep 1999 10:40:35 -0400
Subject: Faculty Positions Available At Duke University
Message-ID: <19990930104035.B3972@isds.duke.edu>

Dear Colleague

I would appreciate your assistance in bringing the vacancies below to the attention of potential candidates, and in forwarding the ad to your departmental colleagues. Thanks.

Mike W
===========================================================
Mike West
Arts & Sciences Professor of Statistics & Decision Sciences
Director, Institute of Statistics & Decision Sciences
Duke University, Durham, NC 27708-0251. 
USA
tel/fax: (919) 684-8842/8594
http://www.stat.duke.edu
===========================================================

**********************************************
STATISTICS AND BIOSTATISTICS FACULTY POSITIONS
Institute of Statistics & Decision Sciences
DUKE UNIVERSITY
***************

Duke University has openings for tenured and tenure-track faculty, to begin in Fall 2000. We invite applications and nominations for the positions detailed below:

(a) Full professor in the Institute of Statistics and Decision Sciences (ISDS), and
(b) Assistant professor in Biostatistics in the School of Medicine, with a joint appointment in ISDS.

Suitable applicants for appointment as tenured Professor of Statistics and Decision Sciences will be recognised research leaders in statistics. We are particularly interested in hearing from potential applicants in Bayesian statistics and related areas, and with disciplinary interests in biomedical applications. In collaboration with other departments at Duke, ISDS is developing a range of activities in statistical genetics and bioinformatics more broadly, and so particularly encourages applicants whose applied interests relate to these areas. Applications and nominations should be sent to Mike West, Director, ISDS, Duke University, Durham NC 27708-0251. Applications received by January 15th 2000 will be guaranteed full consideration.

Appointment at the assistant professor level will be tenure track in the Division of Biometry in the School of Medicine, with a secondary appointment in ISDS. The appointee will work on cancer-related research projects and cancer clinical trials in the Biostatistics Unit of the Duke Comprehensive Cancer Center, and will have teaching and research roles in both ISDS and Biometry. A suitable applicant will hold a PhD in statistics or biostatistics, and have evident potential for excellence in research in biomedical statistics and quality teaching. 
Some background in areas involving collaborative medical research, clinical trials and interactions with medical research investigators will be beneficial. Applicants should mail a cv and letter of application, and arrange for three letters of reference to be sent, to the Faculty Search Committee, Box 3958, Duke University Medical Center, Durham, NC 27710. Applications received by January 15th 2000 will be guaranteed full consideration.

Additional appointments in biostatistics, including non-tenure/research track positions, may be available.

Applications from suitably qualified women and minority candidates, for each of the above positions, are particularly encouraged. Duke University is an Equal Opportunity/Affirmative Action Employer.

******************************************************
Further information is available at the ISDS web site:
http://www.stat.duke.edu
******************************************************

From radu_d at fred.EECS.Berkeley.EDU Thu Sep 30 13:45:54 1999
From: radu_d at fred.EECS.Berkeley.EDU (Radu Dogaru)
Date: Thu, 30 Sep 1999 10:45:54 -0700 (PDT)
Subject: Paper on a compact, simple and efficient neural architecture
In-Reply-To: Message-ID:

Dear Connectionists,

The following paper is available and can be downloaded from
http://trixie.eecs.berkeley.edu/~radu_d/publications.html#p
or
http://trixie.eecs.berkeley.edu/~radu_d/dogaru_ijcnn99.pdf
All comments welcome.

Perceptrons Revisited: The Addition of a Non-monotone Recursion Greatly Enhances their Representation and Classification Properties

Radu Dogaru, Marinel Alangiu, Matthias Rychetsky and Manfred Glesner

Abstract

In this paper we describe a novel neural architecture and compare its representation and classification performance with classic solutions. It combines linear units with a compact, simple-to-implement non-linear transform defined as a finite recursion of simple non-monotonic functions. 
When such a nonlinear recursion replaces the standard output function of a perceptron-like structure, the capability to represent Boolean functions is enhanced beyond that of standard linear threshold gates, and arbitrary Boolean functions can be learned. For example, the realization of the parity function with 8 inputs requires only 8 synapses and 3 nonlinear units. While the use of nonlinear recursion at the output accounts for compact learning and memorization of arbitrary functions, it was found that good generalization capabilities are obtained when the nonlinear recursion is placed at the inputs. It is thus concluded that the proper addition of a simple nonlinear structure to the well-known linear perceptron removes most of its drawbacks, the resulting architecture being compact, easy to implement, and functionally equivalent to more sophisticated neural systems.

---------------------------------------------------------
Dr. Radu Dogaru
c/o Prof. Leon O. Chua
University of California at Berkeley
Department of Electrical Engineering and Computer Science
Cory Hall #1770
Berkeley, CA 94720-1770
Tel: (510) 643-8868
Fax: (510) 643-8869
E-mail: radu_d at fred.EECS.Berkeley.EDU
http://trixie.eecs.berkeley.edu/~radu_d
_________________________________________________________

From fayyad at MICROSOFT.com Thu Sep 30 14:21:07 1999
From: fayyad at MICROSOFT.com (Usama Fayyad)
Date: Thu, 30 Sep 1999 11:21:07 -0700
Subject: SIGKDD Explorations: call for papers for vol. 1 issue 2
Message-ID:

This is to announce that the second issue of SIGKDD Explorations, the official newsletter of the ACM's new Special Interest Group (SIG) on Knowledge Discovery and Data Mining, will be published by the end of the year. The first issue is available online at http://research.microsoft.com/datamine/SIGKDD. The SIGKDD Explorations newsletter is sent to the ACM SIGKDD membership and to a world-wide network of libraries. 
The ACM SIGKDD is a new special interest group and has grown to over 1000 members in its first 6 months. We invite submissions to the second issue, which will be published by year end. We are particularly interested in short research and survey articles on various aspects of data mining and KDD. Submissions can be made in any one of the following categories:

- survey/tutorial articles (short) on important topics, not exceeding 20 pages
- topical articles on problems and challenges
- well-articulated position papers
- technical articles, not exceeding 15 pages
- news items on the order of 1-3 paragraphs
- brief announcements not exceeding 5 lines in length
- review articles of products and methodologies, not exceeding 20 pages
- reviews/summaries from conferences, panels and special meetings
- reports on relevant meetings and committees related to the field

Submissions should be made to fayyad at acm.org or sunita at cs.berkeley.edu. All submissions must arrive by October 20, 1999 for inclusion in the next issue. Please provide a URL if there is associated web information.

Some words about the SIGKDD newsletter:
---------------------------------------

SIGKDD Explorations is a bi-annual newsletter dedicated to serving the SIGKDD membership and community. Our goal is to make it an informative, rapid publication and an interesting forum for communicating with the SIGKDD community. Submissions will be reviewed by the editor and/or associate editors as appropriate. Distribution will be very wide: on the web (probably without restricted access during the first year), to all members, and to ACM's world-wide network of libraries; members get e-mail notifications of new issues and hardcopies if they desire.

For more information on SIGKDD visit http://www.acm.org/sigkdd and for more information on the newsletter visit http://research.microsoft.com/datamine/SIGKDD. 
Usama Fayyad, Editor-in-Chief        fayyad at acm.org
Sunita Sarawagi, Associate Editor    sunita at cs.berkeley.edu