From obermaier at dpmi.tu-graz.ac.at Thu Jul 1 04:40:43 1999
From: obermaier at dpmi.tu-graz.ac.at (Bernhard Obermaier)
Date: Thu, 01 Jul 1999 10:40:43 +0200
Subject: Paper available: HMM used for Brain Computer Interface
Message-ID: <377B298B.6981C13B@dpmi.tu-graz.ac.at>

The following paper, 'HMM used for the offline classification of EEG data', is published in 'Biomedizinische Technik' 1999/6 and is available from my web page: http://www-dpmi.tu-graz.ac.at/~obermai/

Abstract: Hidden Markov models (HMM) are introduced for the offline classification of single-trial EEG data in a brain-computer interface (BCI). The HMMs are used to classify Hjorth parameters calculated from bipolar EEG data recorded during the imagination of a left or right hand movement. The effects of different types of HMMs on the recognition rate are discussed. Furthermore, a comparison of the results achieved with the linear discriminant (LD) and the HMM is presented.

-----------------------------------------------------------------------------------------
Dipl.-Ing. Bernhard Obermaier obermaier at dpmi.tu-graz.ac.at
Institute of Biomedical Engineering
Department of Medical Informatics
Infeldgasse 16a, A-8010 Graz, Austria
Phone: +43-316-873-5311 | Fax: +43-316-873-5349
-----------------------------------------------------------------------------------------

From mozer at cs.colorado.edu Thu Jul 1 16:06:13 1999
From: mozer at cs.colorado.edu (Mike Mozer)
Date: Thu, 01 Jul 99 14:06:13 -0600
Subject: position available in speech recognition research
Message-ID: <199907012006.OAA19334@neuron.cs.colorado.edu>

Senior Research Technologist
Sensory Inc.
Sunnyvale, CA
http://www.sensoryinc.com

Working under general supervision, this person will develop novel speech encoding and recognition algorithms, and will adapt existing algorithms to low-end platforms with memory and processor-speed constraints.
Primary Responsibilities/Duties
* Provides in-house corporate expertise for traditional methods of speech recognition.
* Working from general concepts, develops specifications and models for speech processing and recognition algorithms.
* Develops testing and cross-validation protocols, methods, and tools for performance evaluation and improvement.
* Applies experience and knowledge to optimize software performance consistent with constraints imposed by implementation on Sensory's custom ICs.
* Works with assembly-language programmers to assist in porting algorithms from C to chip implementation.

Position Requirements, Desired Experience and Skills:
* MS in computer science, engineering, applied statistics, computational linguistics, or cognitive science; Ph.D. preferred
* Strong academic, research, and/or development background in signal processing for speech and automatic speech recognition
* Minimum of 2 years' experience with speech feature encoding and recognition algorithms (including HMMs, neural nets, vector quantization)
* Experience in DSP desirable; knowledge of LPC a plus
* Fluency in C
* State-of-the-art knowledge of machine learning and speech recognition techniques
* The candidate should enjoy discussing and debating complex technical issues.

For additional information, please see Sensory Inc.'s web site (www.sensoryinc.com).
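[Editor's note: the Hjorth parameters classified in the BCI paper announced at the top of this digest are three variance-based descriptors of a signal (activity, mobility, complexity). A minimal sketch of the standard definitions, assuming a NumPy array of EEG samples; the function name and the test signal are illustrative only, not from the paper:]

```python
import numpy as np

def hjorth(x):
    """Hjorth descriptors of a 1-D signal.

    activity   = var(x)
    mobility   = sqrt(var(x') / var(x))     (a frequency measure)
    complexity = mobility(x') / mobility(x) (deviation from a pure sine)
    """
    dx = np.diff(x)    # first derivative via finite differences
    ddx = np.diff(dx)  # second derivative
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity
```

For a pure sinusoid the complexity is close to 1, while broader-band signals score higher; the cheapness of these three numbers is what makes them attractive as per-trial features for an HMM or LD classifier.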
From RK at hirn.uni-duesseldorf.de Thu Jul 1 16:29:46 1999
From: RK at hirn.uni-duesseldorf.de (Rolf Kotter)
Date: Thu, 01 Jul 1999 22:29:46 +0200
Subject: Special Issue "Computational Neuroscience"
Message-ID: <377BCFBA.F37F2C91@hirn.uni-duesseldorf.de>

[apologies if you receive this announcement via several routes]

REVIEWS IN THE NEUROSCIENCES
SPECIAL ISSUE
Computational Neuroscience
Guest Editor: Rolf Kötter

Contents:

Trends in European Computational Neuroscience
Rolf Kötter

Contrast Adaptation and Infomax in Visual Cortical Neurons
Péter Adorján, Christian Piepenbrock, Klaus Obermayer

Single Cell and Population Activities in Cortical-like Systems
Fülöp Bazsó, Ádám Kepecs, Máté Lengyel, Szabolcs Payrits, Krisztina Szalisznyó, László Zalányi, Péter Érdi

Computational Models of Predictive and Memory-related Functions of the Hippocampus
Roman Borisyuk, Michael Denham, Susan Denham, Frank Hoppensteadt

Using Realistic Models to Study Synaptic Integration in Cerebellar Purkinje Cells
Erik De Schutter

Towards an Integration of Biochemical and Biophysical Models of Neuronal Information Processing: A Case Study in the Nigro-striatal System
Rolf Kötter, Dirk Schirok

Interaction of Cortex and Hippocampus in a Model of Amnesia and Semantic Dementia
Jaap M.J. Murre

Properties of the Evoked Spatio-temporal Electrical Activity in Neuronal Assemblies
Giulietta Pinato, Pietro Parodi, Alessandro Bisso, Domenico Macrì, Akio Kawana, Yasuhiko Jimbo, Vincent Torre

What Can Robots Tell Us About Brains? A Synthetic Approach Towards the Study of Learning and Problem Solving
Thomas Voegtlin, Paul F.M.J. Verschure

Publication date: October, 1999.
Further information and download of editorial: http://www.hirn.uni-duesseldorf.de/~rk/rins_toc.htm

From foster at Basit.COM Thu Jul 1 16:49:12 1999
From: foster at Basit.COM (Foster John Provost)
Date: Thu, 1 Jul 1999 16:49:12 -0400 (EDT)
Subject: KDD-99 Contest Announcement
Message-ID: <199907012049.QAA00693@knowledge.basit.com>

1999 KDD-CUP Contests

This year there will be two contests in association with the 1999 ACM SIGKDD conference in San Diego. One contest involves open-ended knowledge discovery, using clustering algorithms, rule discovery, and other methods for acquiring high-level knowledge from commercial data. The other contest involves building a classifier for detecting computer network intrusions from a very large database of network traffic records. The deadline for submitting entries is July 25. Awards will be presented at KDD-99 in August.

For further details see the KDD-99 home page: http://research.microsoft.com/datamine/kdd99/ which will take you to the 1999 KDD-CUP web page: http://www.deas.harvard.edu/courses/cs281r/cup99.html

Important note: the second URL may change, and updated contest information may be posted at any time. Those interested should monitor the KDD-99 home page.

From ml_conn at infrm.kiev.ua Fri Jul 2 05:49:31 1999
From: ml_conn at infrm.kiev.ua (Dmitri Rachkovskij)
Date: Fri, 2 Jul 1999 11:49:31 +0200 (UKR)
Subject: Connectionist symbol processing: any progress?
References: Message-ID:

Keywords: compositional distributed representations, sparse coding, binary coding, binding, representation of structure, recursive representation, nested representation, connectionist symbol processing, long-term memory, associative-projective neural networks, analogical reasoning.

Dear Colleagues,

The following paper draft (abstract enclosed) is available at: http://cogprints.soton.ac.uk/abs/comp/199907001
Dmitri A. Rachkovskij "Representation and Processing of Structures with Binary Sparse Distributed Codes".
Also you may be interested in a related paper (abstract enclosed): http://cogprints.soton.ac.uk/abs/comp/199904008
Dmitri A. Rachkovskij & Ernst M. Kussul "Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning".

Comments are welcome!

Thank you and best regards,
Dmitri Rachkovskij

Encl:

Representation and Processing of Structures with Binary Sparse Distributed Codes

Abstract: Schemes for compositional distributed representations include those that allow on-the-fly construction of fixed-dimensionality codevectors encoding structures of various complexity. The similarity of such codevectors takes into account both the structural and the semantic similarity of the represented structures. In this paper we provide a comparative description of the sparse binary distributed representations developed within the framework of the Associative-Projective Neural Network architecture and of the better-known Holographic Reduced Representations of Plate and Binary Spatter Codes of Kanerva. The key procedure in Associative-Projective Neural Networks is Context-Dependent Thinning, which binds codevectors and maintains their sparseness. The codevectors are stored in a structured memory array which can be realized as a distributed auto-associative memory. Examples of distributed representations of structured data are given. Fast estimation of the similarity of analogical episodes by the overlap of their codevectors is used in modeling analogical reasoning, both for retrieval of analogs from memory and for analogical mapping.

--------

Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning

Abstract: Distributed representations have often been criticized as inappropriate for encoding data with a complex structure.
However, Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality. In this paper we consider the procedures of Context-Dependent Thinning, which were developed for the representation of complex hierarchical items in the architecture of Associative-Projective Neural Networks. These procedures bind items represented by sparse binary codevectors (with a low probability of 1s). Such an encoding is biologically plausible and makes it possible to reach a high information capacity of the distributed associative memory in which the codevectors may be stored. In contrast to known binding procedures, Context-Dependent Thinning maintains the same low density (or sparseness) of the bound codevector for a varying number of constituent codevectors. Moreover, a bound codevector is not only similar to other bound codevectors with similar constituents (as in other schemes), but is also similar to the constituent codevectors themselves. This makes it possible to estimate structural similarity simply by the overlap of codevectors, without retrieving the constituent codevectors; it also allows easy retrieval of the constituent codevectors. Examples of algorithmic and neural network implementations of the thinning procedures are considered. We also present examples of the representation of various types of nested structured data (propositions using role-filler and predicate-argument representations, trees, directed acyclic graphs) using sparse codevectors of fixed dimension. Such representations may provide a fruitful alternative to the symbolic representations of traditional AI, as well as to localist and microfeature-based connectionist representations.

*************************************************************************
Dmitri A. Rachkovskij, Ph.D.
Net: dar at infrm.kiev.ua
Senior Researcher, V.M. Glushkov Cybernetics Center, Tel: 380 (44) 266-4119
Pr. Acad. Glushkova 40, Kiev 22, 252022, UKRAINE, Fax: 380 (44) 266-1570
*************************************************************************

From espaa at exeter.ac.uk Fri Jul 2 04:51:45 1999
From: espaa at exeter.ac.uk (ESPAA)
Date: Fri, 2 Jul 1999 09:51:45 +0100 (GMT Daylight Time)
Subject: PAA Content of Papers
In-Reply-To: Message-ID:

PATTERN ANALYSIS AND APPLICATIONS JOURNAL
SPRINGER-VERLAG
SPECIAL ISSUE ON NEURAL NETWORKS IN IMAGE PROCESSING
JUNE 1999, vol. 2, issue 2
http://www.dcs.exeter.ac.uk/paa

Full Details with Abstracts at: http://www.dcs.ex.ac.uk/paa/vol2.htm
____________________________________

"Image Feature Extraction and Denoising by Sparse Coding"
Erkki Oja, Aapo Hyvarinen, and Patrick Hoyer, Helsinki University of Technology, Laboratory of Computer and Information Science, Finland
pp. 104-110

"The Applicability of Neural Networks to Non-linear Image Processing"
Dick de Ridder, Robert P.W. Duin, Piet W. Verbeek, and Lucas J. van Vliet, Applied Physics Department, Delft University of Technology, Delft, Netherlands
pp. 111-128

"Outdoor Scene Classification by a Neural Tree-based Approach"
G.L. Foresti, Department of Mathematics and Computer Science (DIMI), University of Udine, Udine, Italy
pp.
129-142

"A Neural Network Approach to Planar-Object Recognition in 3D Space"
H.C. Sim and R.I. Draper, Image, Speech and Intelligent Systems Research Group, Department of Electronics and Computer Science, University of Southampton, UK
pp. 143-163

"Color Image Indexing Using SOM for Region-of-Interest Retrieval"
Tao Chen, Li-Hui Chen, and Kai-Kuang Ma, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore
pp. 164-171

"Detection of Bone Tumours in Radiographic Images using Neural Networks"
M. Egmont-Petersen, Division of Image Processing, Dept. of Radiology, Leiden University Medical Centre, The Netherlands
E. Pelikan, Scientific Technical Dept., Philips Medical Systems, Hamburg, Germany
pp. 172-183

"Application of a Steerable Wavelet Transform using Neural Network for Signature Verification"
Emad A. Fadhel and P. Bhattacharya, Department of Computer Science and Engineering, IIT-Bombay, India
pp.
184-195

"Reducing the Dimensions of Texture Features for Image Retrieval Using Multi-layer Neural Networks"
Jose Antonio Catalan, Jesse Jin, and Tom Gedeon, School of Computer Science and Engineering, University of New South Wales, Sydney, Australia
pp. 196-203
__________________________________

Oliver Jenkin
Editorial Secretary
Pattern Analysis and Applications
Department of Computer Science
University of Exeter
Exeter EX4 4PT
tel: +44-1392-264066
fax: +44-1392-264067
email: espaa at exeter.ac.uk
____________________________

From NEuroNet at kcl.ac.uk Fri Jul 2 05:47:20 1999
From: NEuroNet at kcl.ac.uk (NEuroNet)
Date: Fri, 2 Jul 1999 10:47:20 +0100 (BST)
Subject: NEuroNet WWW Site
Message-ID: <199907020947.KAA29178@mail.kcl.ac.uk>

This is to announce that NEuroNet, the European "Network of Excellence" in Neural Networks (funded by the 4th Framework Programme of the European Union), has a new WWW site at: http://www.kcl.ac.uk/neuronet/

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
Ms Terhi V Manuel-Garner, MA
Administrator to NEuroNet
NEuroNet, European Network of Excellence in Neural Networks
Department of Electronic Engineering
King's College London, Strand, London WC2R 2LS, UK
Tel.: +44 (0) 171 848 2388
Fax: +44 (0) 171 848 2559
http://www.kcl.ac.uk/neuronet
Email: NEuroNet at kcl.ac.uk
- PLEASE NOTE NEW TELEPHONE AND FAX NOS -
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

From i.rezek at ic.ac.uk Fri Jul 2 12:31:30 1999
From: i.rezek at ic.ac.uk (Iead Rezek)
Date: Fri, 02 Jul 1999 17:31:30 +0100
Subject: PH.D. STUDENT RESEARCH POSITION AT U.B.C.
Message-ID: <377CE962.CE67477D@ic.ac.uk>

*********** PH.D. STUDENT RESEARCH POSITION OPENING ****************
Electrical & Computer Engineering Dept.
University of British Columbia (UBC)
http://www.ece.ubc.ca

The Pattern Recognition research group in the Electrical & Computer Engineering Dept. of the University of British Columbia (UBC) seeks an appropriately qualified individual to work on the development of Software Engineering methodologies & tools for Pattern Recognition research and development. The applicant will have to apply for enrollment in a Ph.D. program, under the supervision of Prof. Rabab Ward FRSC, and will be expected to interact and cooperate with the other members of the research group. A full description of the project is attached to this email. Potential applicants should contact Dr. Nawwaf Kharma for an interview. The last date for submission of applications is July 31st, 1999 (for the session starting January 1st, 2000). Late applicants will be considered for the next academic year. The successful applicant will have to provide for one full year of tuition & maintenance costs; however, he/she can expect to work as a Teaching Assistant during the following years, for a total salary of about C$ 17K (enough to cover tuition plus most maintenance costs). We would be very grateful if you would post or share this information with other people interested in a Ph.D. in Pattern Recognition & Software Engineering at UBC.

Nawwaf N. Kharma
E&CE Dept. UBC
2356 Main Mall, Vancouver, B.C.
Canada V6T 1Z4
Tel: +1-604-822-1742
Fax: +1-604-822-5949
Email: nawwaf at ieee.org

================================================================

Ph.D. Project Description

Purpose

The aim of this project is to investigate the theoretical and pragmatic aspects of using guided random optimization techniques (e.g. Genetic Algorithms) for the automatic construction of pattern recognition software systems.
The final goal of this project is the development of a software system capable (with little human intervention) of constructing another software system, one which is (near-)optimally suited to a specific recognition task. An example of a recognition task would be recognizing the set of words making up a hand-written Arabic sentence.

Background & Motivation

A character recognition system may be broken up, functionally, into four components. One component carries out 'pre-processing' functions, such as normalization and thinning. Another component accepts the pre-processed input pattern and extracts those features that best characterize the pattern. The extracted features are used by a 'classification' component (such as a Neural Net) to assign a label to the pattern. All functions carried out after (initial) classification fall under 'post-processing'.

The majority of the effort expended in the field of character recognition has been directed towards the discovery of new algorithms: ones that pre-process the input pattern, that extract the most relevant features, or that are most able to classify the patterns with increased certainty and efficiency. However, almost all of this effort has been expended manually, though there is no reason why a lot of this work cannot, in the future, be done by machines. This is exactly the reason for our interest in any and all techniques that are potentially capable of emulating some of Man's unique ability to search and optimize, and to do so efficiently.

Objectives

The specific objectives of this project are:
- To achieve a clear and comprehensive understanding of the processes of feature selection and classification, both in character recognition systems and in humans.
- To survey the wide range of genetic and other evolutionary computation algorithms (& software) currently used in the realm of Pattern Recognition, and specifically Character Recognition.
- To acquire a good degree of working knowledge of recent methodologies of software engineering, such as object-oriented methodologies (including UML), as well as a reasonable level of software testing expertise.
- To propose a hypothesis relating to the theoretical and pragmatic aspects of the automatic development of character/pattern recognition software systems.
- To methodically specify, design, and code a software development platform capable of authoring/customizing software for specific character recognition purposes (in line with the hypothesis above).
- To validate the software development platform via extensive testing, including using it to actually construct a simple (but complete) example of character recognition software.
- To document all the above via various progress reports, scientific papers, a software user's manual, and (of course) a thesis.

Skills Required

Very good programming skills (in Delphi/C++), plus an interest in Software Engineering. Some basic background in Character Recognition and/or the ability to autonomously learn new concepts and techniques in Structural/Statistical Pattern Recognition. Good writing, presentation, and communication skills (in English). Patience!

From ormoneit at stat.Stanford.EDU Fri Jul 2 20:24:18 1999
From: ormoneit at stat.Stanford.EDU (Dirk Ormoneit)
Date: Fri, 2 Jul 1999 17:24:18 -0700 (PDT)
Subject: Report on Local Linear Regression
Message-ID: <199907030024.RAA15468@rgmiller.Stanford.EDU>

The following technical report is now available on-line at http://www-stat.stanford.edu/~ormoneit/tr-1999-11.ps

Best, Dirk

------------------------------------------------------------------

OPTIMAL KERNEL SHAPES FOR LOCAL LINEAR REGRESSION
by Dirk Ormoneit and Trevor Hastie

Local linear regression performs very well in many low-dimensional forecasting problems. In high-dimensional spaces, its performance typically decays due to the well-known ``curse-of-dimensionality''.
Specifically, the volume of a weighting kernel that contains a fixed number of samples increases exponentially with the number of dimensions. The bias of a local linear estimate may thus become unacceptable for many real-world data sets. A possible way to control the bias is by varying the ``shape'' of the weighting kernel. In this work we suggest a new, data-driven method for estimating the optimal kernel shape. Experiments using two artificially generated data sets and data from the UC Irvine repository show the benefits of kernel shaping.

--------------------------------------------

Dirk Ormoneit
Department of Statistics, Room 206
Stanford University
Stanford, CA 94305-4065
ph.: (650) 725-6148
fax: (650) 725-8977
ormoneit at stat.stanford.edu
http://www-stat.stanford.edu/~ormoneit/

From rreed at wport.com Sun Jul 4 00:01:02 1999
From: rreed at wport.com (Russ Reed)
Date: Sat, 3 Jul 1999 21:01:02 -0700
Subject: new book
Message-ID:

This may interest readers of this list.

NEW BOOK: Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks
Russell D. Reed & Robert J. Marks II (MIT Press, 1999).

Contents: Artificial neural networks are nonlinear mapping systems whose structure is loosely based on principles observed in the nervous systems of humans and animals. The basic idea is that massive systems of simple units linked together in appropriate ways can generate many complex and interesting behaviors. This book focuses on the subset of feedforward artificial neural networks called multilayer perceptrons (MLPs). These are the most widely used neural networks, with applications as diverse as finance (forecasting), manufacturing (process control), and science (speech and image recognition). This book presents an extensive and practical overview of almost every aspect of MLP methodology, progressing from an initial discussion of what MLPs are and how they might be used to an in-depth examination of technical factors affecting performance.
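[Editor's note: as a flavor of the back-propagation algorithm the book's middle chapters dissect, here is a one-hidden-layer MLP trained by plain gradient descent on XOR. This is a generic NumPy toy, not code from the book; the layer sizes, learning rate, and seed are arbitrary choices:]

```python
import numpy as np

def train_xor(epochs=5000, lr=1.0, seed=1):
    """Train a 2-4-1 sigmoid MLP on XOR; return the per-epoch MSE."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1, b1 = rng.normal(0.0, 1.0, (2, 4)), np.zeros(4)
    W2, b2 = rng.normal(0.0, 1.0, (4, 1)), np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    losses = []
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                 # forward pass, hidden layer
        out = sig(h @ W2 + b2)               # forward pass, output layer
        losses.append(float(np.mean((out - y) ** 2)))
        d_out = (out - y) * out * (1 - out)  # error signal at the output
        d_h = (d_out @ W2.T) * h * (1 - h)   # error propagated back to hidden units
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);  b1 -= lr * d_h.sum(axis=0)
    return losses
```

Issues the book treats at length (learning rate, momentum, weight initialization, the error surface) show up immediately even in a toy like this: changing `lr` or `seed` can stall the loss curve on a plateau or change the convergence speed dramatically.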
The book can be used as a tool kit by readers interested in applying networks to specific problems, yet it also presents theory and references outlining the last ten years of MLP research.

Table of Contents
Preface
1 Introduction 1
2 Supervised Learning 7
3 Single-Layer Networks 15
4 MLP Representational Capabilities 31
5 Back-Propagation 49
6 Learning Rate and Momentum 71
7 Weight-Initialization Techniques 97
8 The Error Surface 113
9 Faster Variations of Back-Propagation 135
10 Classical Optimization Techniques 155
11 Genetic Algorithms and Neural Networks 185
12 Constructive Methods 197
13 Pruning Algorithms 219
14 Factors Influencing Generalization 239
15 Generalization Prediction and Assessment 257
16 Heuristics for Improving Generalization 265
17 Effects of Training with Noisy Inputs 277
A Linear Regression 293
B Principal Components Analysis 299
C Jitter Calculations 311
D Sigmoid-like Nonlinear Functions 315
References 319
Index 339

Ordering information:
1. MIT Press http://mitpress.mit.edu/book-home.tcl?isbn=0262181908
2. amazon.com http://www.amazon.com/exec/obidos/ASIN/0262181908/qid%3D909520837/sr%3D1-21/002-3321940-3881246
3. Barnes & Noble http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=1KKG10OPZT&mscssid=A7M4XXV5DNS12MEG00CGNDBFPT573NJS&pcount=&isbn=0262181908
4. buy.com http://www.buy.com/books/product.asp?sku=30360116

From sbay at algonquin.ics.uci.edu Thu Jul 1 02:06:13 1999
From: sbay at algonquin.ics.uci.edu (Stephen D. Bay)
Date: Wed, 30 Jun 1999 23:06:13 -0700
Subject: The UCI KDD Archive
Message-ID: <9906302306.aa02757@paris.ics.uci.edu>

**************************************************
The UCI KDD Archive
Call for Datasets
http://kdd.ics.uci.edu/
**************************************************

The UC Irvine Knowledge Discovery in Databases (KDD) Archive is a new online repository (http://kdd.ics.uci.edu/) of large datasets which encompasses a wide variety of data types, analysis tasks, and application areas.
The primary role of this repository is to serve as a benchmark testbed to enable researchers in knowledge discovery and data mining to scale existing and future data analysis algorithms to very large and complex data sets. This archive is supported by the Information and Data Management Program at the National Science Foundation, and is intended to expand the current UCI Machine Learning Database Repository (http://www.ics.uci.edu/~mlearn/MLRepository.html) to datasets that are orders of magnitude larger and more complex. We are seeking submissions of large, well-documented datasets that can be made publicly available. Data types and tasks of interest include, but are not limited to:

Data types: multivariate, time series, sequential, relational, text/web, image, spatial, multimedia, transactional, heterogeneous, sound/audio
Tasks: classification, regression, clustering, density estimation, retrieval, causal modeling, visualization, discovery, exploratory data analysis, data cleaning, recommendation systems

Submission Guidelines: Please see the UCI KDD Archive web site for detailed instructions.

Stephen Bay (sbay at ics.uci.edu)
librarian

From ralfh at cs.tu-berlin.de Tue Jul 6 05:19:57 1999
From: ralfh at cs.tu-berlin.de (Ralf Herbrich)
Date: Tue, 6 Jul 1999 11:19:57 +0200 (MET DST)
Subject: TR announcement
Message-ID:

We would like to announce the availability of the following technical report:

Bayesian Learning in Reproducing Kernel Hilbert Spaces
Ralf Herbrich, Thore Graepel, Colin Campbell

Abstract: Support Vector Machines find the hypothesis that corresponds to the centre of the largest hypersphere that can be placed inside version space, i.e. the space of all consistent hypotheses given a training set. The boundaries of version space touched by this hypersphere define the support vectors. An even more promising approach is to construct the hypothesis using the whole of version space.
This is achieved by the Bayes point: the midpoint of the region of intersection of all hyperplanes bisecting version space into two volumes of equal magnitude. It is known that the centre of mass of version space approximates the Bayes point. The centre of mass is estimated by averaging over the trajectory of a billiard in version space. We derive bounds on the generalisation error of Bayesian classifiers in terms of the volume ratio of version space and parameter space. This ratio serves as an effective VC dimension and greatly influences generalisation. We present experimental results indicating that Bayes Point Machines consistently outperform Support Vector Machines. Moreover, we show theoretically and experimentally how Bayes Point Machines can easily be extended to admit training errors.

The gzipped PS file can be downloaded at http://stat.cs.tu-berlin.de/~ralfh/bayes.ps.gz

Best regards,
Ralf Herbrich, Thore Graepel, and Colin Campbell

-------------------------------------------------------------------
Ralf Herbrich phone : +49-30-314-25817
TU Berlin email : ralfh at cs.tu-berlin.de
FB 13; FR 6-9 URL : http://stat.cs.tu-berlin.de/~ralfh
10587 Berlin, Germany
[teaching assistant in the statistics group at the TU Berlin]
------------------------------------------------------------------

From oby at cs.tu-berlin.de Tue Jul 6 11:04:39 1999
From: oby at cs.tu-berlin.de (Klaus Obermayer)
Date: Tue, 6 Jul 1999 17:04:39 +0200 (MET DST)
Subject: preprints available
Message-ID: <199907061504.RAA03798@pollux.cs.tu-berlin.de>

Dear connectionists, attached please find abstracts and preprint locations of four manuscripts on ANN theory and one short manuscript on visual cortex modelling:

---
1. self-organizing maps for similarity data & active learning (book chapter)
2. support vector learning for ordinal data (conference paper)
3. classification on proximity data with LP-machines (conference paper)
4. neural networks in economics (review)
---
5.
contrast response and orientation tuning in a mean field model of visual cortex (conference paper)
---

Comments are welcome!

Cheers, Klaus

-----------------------------------------------------------------------------
Prof. Dr. Klaus Obermayer phone: 49-30-314-73442
FR2-1, NI, Informatik 49-30-314-73120
Technische Universitaet Berlin fax: 49-30-314-73121
Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de
10587 Berlin, Germany http://ni.cs.tu-berlin.de/
=============================================================================

Active Learning in Self-Organizing Maps
M. Hasenj\"ager^1, H. Ritter^1 and K. Obermayer^2
^1 Technische Fakult\"at, Universit\"at Bielefeld; ^2 Fachbereich Informatik, Technische Universitaet Berlin

The self-organizing map (SOM) was originally proposed by T. Kohonen in 1982 on biological grounds and has since become a widespread tool for exploratory data analysis. Although introduced as a heuristic, SOMs have been related to statistical methods in recent years, which has led to a theoretical foundation in terms of cost functions as well as to extensions to the analysis of pairwise data, in particular of dissimilarity data. In our contribution, we first relate SOMs to probabilistic autoencoders, re-derive the SOM version for dissimilarity data, and review part of the above-mentioned work. Then we turn our attention to the fact that dissimilarity-based algorithms scale with O($D^2$), where {\it D} denotes the number of data items, and may therefore become impractical for real-world datasets. We find that the majority of the elements of a dissimilarity matrix are redundant and that a sparse matrix with more than 80% missing values suffices to learn a SOM representation of low cost. We then describe a strategy for selecting the most informative dissimilarities for a given set of objects. We suggest selecting (and measuring) only those elements whose knowledge maximizes the expected reduction in the SOM cost function.
We find that active data selection is computationally expensive, but it may reduce the number of necessary dissimilarities by more than a factor of two compared to a random selection strategy. This makes active data selection a viable alternative when the cost of actually measuring dissimilarities between data objects is high.

in: Kohonen Maps (Eds. E. Oja and S. Kaski), Elsevier, pp. 57-70 (1999).
available at: http://ni.cs.tu-berlin.de/publications/#conference

-----------------------------------------------------------------------------

Support vector learning for ordinal regression
R. Herbrich, T. Graepel, and K. Obermayer
Fachbereich Informatik, Technische Universit\"at Berlin

We investigate the problem of predicting variables of ordinal scale. This task is referred to as {\em ordinal regression} and is complementary to the standard machine learning tasks of classification and metric regression. In contrast to statistical models, we present a distribution-independent formulation of the problem together with uniform bounds on the risk functional. The approach presented is based on a mapping from objects to scalar utility values. Similarly to Support Vector methods, we derive a new learning algorithm for the task of ordinal regression based on large margin rank boundaries. We give experimental results for an information retrieval task: learning the order of documents w.r.t.\ an initial query. Experimental results indicate that the presented algorithm outperforms more naive approaches to ordinal regression, such as Support Vector classification and Support Vector regression, in the case of more than two ranks.

in: International Conference for Artificial Neural Networks 1999 (accepted for publication)
available at: http://ni.cs.tu-berlin.de/publications/#conference

-----------------------------------------------------------------------------

Classification on proximity data with LP-machines
T. Graepel^1, R. Herbrich^1, B. Sch\"ollkopf^2, A. Smola^2, P. Bartlett^3, K.
M\"uller^2, K. Obermayer^1, and R. Williamson^3 ^1 Fachbereich Informatik, Technische Universit\"at Berlin ^2 GMD FIRST ^3 Australian National University We provide a new linear program to deal with the classification of data given in terms of pairwise proximities. This allows us to avoid the problems inherent in using feature spaces with an indefinite metric in Support Vector Machines, since the notion of a margin is needed purely in input space, where the classification actually occurs. Moreover, in our approach we can enforce sparsity in the proximity representation by sacrificing training error. This turns out to be favorable for proximity data. Similar to $\nu$--SV methods, the only parameter needed in the algorithm is the (asymptotic) number of data points being classified with a margin. Finally, the algorithm is successfully compared with $\nu$--SV learning in proximity space and $K$--nearest-neighbors on real-world data from neuroscience and molecular biology. in: International Conference for Artificial Neural Networks 1999 (accepted for publication) available at: http://ni.cs.tu-berlin.de/publications/#conference ----------------------------------------------------------------------------- Neural Networks in Economics: Background, Applications and New Developments R. Herbrich, M. Keilbach, T. Graepel, and K. Obermayer Fachbereich Informatik, Technische Universitaet Berlin Neural networks were developed in the sixties as devices for classification and regression. The approach was originally inspired by neuroscience. Its attractiveness lies in the ability to learn, i.e., to generalize to as yet unseen observations. One aim of this paper is to give an introduction to the technique of neural networks and an overview of the most popular architectures. We start from statistical learning theory to introduce the basics of learning. Then, we give an overview of the general principles of neural networks and of their use in the field of Economics.
A second purpose is to introduce a recently developed neural network learning technique, so-called Support Vector Network Learning, which is an application of ideas from statistical learning theory. This approach has shown very promising results on problems with a limited amount of training examples. Moreover, utilizing a technique known as the kernel trick, Support Vector Networks can easily be adapted to nonlinear models. Finally, we present an economic application of this approach from the field of preference learning. in: Computational Techniques for Modelling Learning in Economics (Ed. T. Brenner), Kluwer Academic, in press (1999) available at: http://ni.cs.tu-berlin.de/publications/#books ----------------------------------------------------------------------------- On the Influence of Threshold Variability in a Model of the Visual Cortex H. Bartsch, M. Stetter, and K. Obermayer Fachbereich Informatik, Technische Universit\"at Berlin Orientation--selective neurons in monkeys and cats show contrast saturation and contrast--invariant orientation tuning. Recently proposed models for orientation selectivity predict contrast-invariant orientation tuning but no contrast saturation at high strengths of recurrent intracortical coupling, whereas at lower coupling strengths the contrast response saturates but the tuning widths are contrast dependent. In the present work we address the question of whether, and under which conditions, the incorporation of a stochastic distribution of activation thresholds of cortical neurons leads to saturation of the contrast response curve as a network effect. We find that contrast saturation occurs naturally if two different classes of inhibitory interneurons are combined. Low-threshold inhibition keeps the gain of the cortical amplification finite, whereas high-threshold inhibition causes contrast saturation.
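As a toy illustration of the two-inhibition mechanism described above (my own construction with made-up parameters, not the authors' cortical model), a threshold-linear rate unit receiving both a low-threshold and a high-threshold inhibitory input produces exactly this kind of saturating contrast-response curve:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def response(c, g=1.0, a=0.5, b=0.5, theta_high=0.5):
    """Toy contrast-response function; all parameters are illustrative."""
    drive = g * c                       # feed-forward excitatory drive
    i_low = a * relu(c)                 # low-threshold inhibition: finite gain
    i_high = b * relu(c - theta_high)   # high-threshold inhibition: saturation
    return relu(drive - i_low - i_high)

c = np.linspace(0.0, 1.0, 11)
print(np.round(response(c), 3))   # rises linearly, then saturates at 0.25
```

Below the high threshold the response grows with slope g - a (finite gain from low-threshold inhibition); above it, the high-threshold inhibition cancels further growth and the curve saturates.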
in: International Conference for Artificial Neural Networks 1999 (accepted for publication) available at: http://ni.cs.tu-berlin.de/publications/#conference From fmdist at hotmail.com Tue Jul 6 13:08:37 1999 From: fmdist at hotmail.com (Fionn Murtagh) Date: Tue, 06 Jul 1999 10:08:37 PDT Subject: Post-Doc Research Fellowship available, 3 yrs Message-ID: <19990706170837.65673.qmail@hotmail.com> Post-Doc Research Fellowship available, 3 yrs Research areas of particular interest - non-exhaustive: - image display and transfer, eyegaze monitoring - image and video compression, videoconferencing - wavelet transforms, multiscale methods, neural networks - multivariate data analysis, signal processing - space, medical, financial data and image analysis Information: Prof. F. Murtagh, School of Computer Science, The Queen's University of Belfast, Belfast BT7 1NN, Nth Ireland http://www.cs.qub.ac.uk/~F.Murtagh, email f.murtagh at qub.ac.uk ______________________________________________________ Get Your Private, Free Email at http://www.hotmail.com From klweber at Math.TU-Cottbus.DE Tue Jul 6 05:35:58 1999 From: klweber at Math.TU-Cottbus.DE (Klaus Weber) Date: Tue, 6 Jul 1999 11:35:58 +0200 (MET DST) Subject: NC'2000 Announcement and CfP Message-ID: <199907060935.LAA02849@vieta.math.tu-cottbus.de.math.tu-cottbus.de> ANNOUNCEMENT / CALL FOR PAPERS Second International ICSC Symposium on NEURAL COMPUTATION / NC'2000 To be held at the Technical University of Berlin, Germany May 23-26, 2000 http://www.icsc.ab.ca/nc2000.htm SYMPOSIUM CHAIR Prof. Hans Heinrich Bothe Oerebro University Dept. of Technology Science Fakultetsgatan 1 S - 70182 Oerebro, Sweden Email: hans.bothe at ton.oru.se Phone: +46-19-10-3786 Fax: +46-19-10-3463 and Technical University of Denmark (DTU) Department of Information Technology Building 344 DK-2800 Lyngby, Denmark Email: hhb at it.dtu.dk Phone: +45-4525-3632 Fax: +45-4588-0117 PUBLICATION CHAIR Prof. 
Raul Rojas Freie Universität Berlin Institut Informatik / FB Mathematik Takustrasse 9 D - 14195 Berlin / Germany Email: rojas at inf.fu-berlin.de Fax: +49-30-8387-5109 SYMPOSIUM ORGANIZER ICSC International Computer Science Conventions P.O. Box 657 CH-8055 Zurich / Switzerland Phone: +41-878-888-150 Fax: +41-1-761-9627 Email: icsc at icsc.ch WWW: http://www.icsc.ab.ca INTERNATIONAL PROGRAM COMMITTEE Igor Aleksander, Imperial College of Science & Technology, London, U.K. Peter G. Anderson, Rochester Institute of Technology, NY, USA Horst Bischof, Technical University Vienna, Austria Ruediger W. Brause, J.W. Goethe-University, Frankfurt, Germany Juan Lopez Coronado, Universidad Politecnica de Cartagena, Spain Ludwig Cromme, Brandenburgische Technische Universitaet Cottbus, Germany Chris deSilva, University of Western Australia, Crawley, Australia Georg Dorffner, Austrian Research Institute for Artificial Intelligence, Vienna, Austria Kunihiko Fukushima, The University of Electro-Communications, Tokyo, Japan Wulfram Gerstner, EPFL Lausanne, Switzerland Stan Gielen, University of Nijmegen, Netherlands Marco Gori, University of Siena, Italy Bruce Graham, University of Edinburgh, U.K. Gunham Dundar, Bosporus University, Istanbul, Turkey Chris J. Harris, University of Southampton, U.K. Dorothea Heiss-Czedik, Technical University Vienna, Austria Michael Heiss, Siemens Vienna, Austria Lakhmi C. Jain, University of South Australia, The Levels, Australia Nikola Kasabov, University of Otago, Dunedin, New Zealand Bart Kosko, University of Southern California, Los Angeles CA, USA Rudolf Kruse, University of Magdeburg, Germany Fa-Long Luo, R&D Department, Redwood City, USA G. Nicolas Marichal, University of La Laguna, Tenerife, Spain Giuseppe Martinelli, University of Rome 1, Italy Fazel Naghdy, University of Wollongong, Australia Sankar K. Pal, Indian Statistical Institute, Calcutta, India M.
Palaniswami, University of Melbourne, Australia Yoh-Han Pao, Case Western Reserve University, Cleveland OH, USA Alexander V. Pavlov, Laboratory for Optical Fuzzy Systems, Russia Witold Pedrycz, University of Alberta, Canada Raul Rojas, Freie Universität Berlin, Germany V. David Sanchez A., Falon, Inc., San Diego CA, USA Bernd Schuermann, Siemens ZFE, Munich, Germany J.S. Shawe-Taylor, Royal Holloway University of London, U.K. Peter Sincak, Technical University of Kosice, Slovakia Nigel Steele, Coventry University, U.K. Rainer Stotzka, Forschungszentrum Karlsruhe, Germany Piotr Szczepaniak, Technical University of Lodz, Poland Csaba Szepesvari, University of Szeged, Hungary Henning Tolle, Technische Hochschule Darmstadt, Germany Shiro Usui, Toyohashi University of Technology, Toyohashi, Japan Klaus Weber, Technical University of Cottbus, Germany Andreas Weingessel, Technical University Vienna, Austria Takeshi Yamakawa, Kyushu Institute of Technology, Fukuoka, Japan Andreas Zell, Universitaet Tuebingen, Germany Jacek M. Zurada, University of Louisville, KY, USA (list incomplete) ************************************************* INTRODUCTION The science of neural computation focuses on the mathematical aspects of solving complex practical problems, and it also seeks to help neurology, brain theory and cognitive psychology in the understanding of the functioning of the nervous system by means of computational models of neurons, neural nets and subcellular processes. NC'2000 aims to become a major point of contact for research scientists, engineers and practitioners throughout the world in the field of Neural Computation. Participants will share the latest research, developments and ideas in the wide arena of disciplines encompassed under the heading of NC'2000, as a follow-up to the successful NC'98 conference in Vienna, Austria.
************************************************* TOPICS Contributions are sought in areas based on the list below, which is indicative only. Contributions from new application areas are welcome. COMPUTATIONAL NEURAL NETWORK MODELS - Artificial neural network paradigms - Knowledge representation - Learning and generalization - Probabilistic neural networks - Information theoretic approaches - Time-coded neural networks - Pulse-coded neural networks - Self-organization - Cellular automata - Hybrid systems (e.g. neuro-fuzzy, GA, evolutionary strategies) - Chaos in neural networks - Statistical and numerical aspects NEUROPHYSIOLOGICALLY INSPIRED MODELS - Neurophysiological foundations - Spiking neuron models and neuron assemblies - Models of brain centers and sensory pathways - Sensorimotor integration - Sensation, Perception and Attention - Spatio-temporal Orientation - Reactive Behavior SOFTWARE AND HARDWARE IMPLEMENTATIONS - Simulation and Graphical Programming Tools - Distributed Systems - Neuro-chips, -controllers and -computers - Analog and Digital Electronic Implementations - Optic, Holographic Implementations NEURAL NETWORK APPLICATIONS - Pre-processing and Feature Extraction - Sound, Speech and Image Processing - Pattern Recognition and System Identification - Computer Vision, Feature Binding and Image Understanding - Autonomous Sensor Systems, Multivariate Sensor Fusion - Robotics and Control - Behavior based Exploration and Planning - Power Systems - Environmental Systems - Decision Support Systems - Medical Applications - Operational Research and Logistics ************************************************* SCIENTIFIC PROGRAM NC'2000 will include invited plenary talks, contributed sessions, invited sessions, workshops and tutorials. ************************************************* INVITED SESSIONS The organization of invited sessions is encouraged.
Prospective organizers are requested to send a session proposal (consisting of 4-5 invited papers, the recommended session-chair and co-chair, as well as a short statement describing the title and the purpose of the session) to the Symposium Chairman or the Symposium Organizer. Invited sessions should preferably start with a tutorial paper. The registration fee of the session organizer will be waived if at least 4 authors of invited papers register for the conference. ************************************************* POSTER PRESENTATIONS Poster presentations are encouraged for people who wish to receive peer feedback, and practical examples of applied research are particularly welcome. Poster sessions will allow the presentation and discussion of the respective papers, which will also be included in the conference proceedings. ************************************************* WORKSHOPS, TUTORIALS AND OTHER CONTRIBUTIONS Proposals should be submitted as soon as possible to the Symposium Chairman or the Symposium Organizer. ************************************************* SUBMISSION OF PAPERS Prospective authors are requested to send either a draft paper or an extended abstract for review by the International Program Committee. All papers must be written in English, starting with a succinct statement of the problem, the results achieved, their significance and a comparison with previous work. Submissions must be received by October 31, 1999. Regular papers, as well as poster presentations, tutorial papers and invited sessions are encouraged. The abstract should also include: - Title of conference (NC'2000) - Type of paper (regular, poster, tutorial or invited) - Title of proposed paper - Authors' names, affiliations, addresses - Name of author to contact for correspondence - E-mail address and fax # of contact author - Topics which best describe the paper (max.
5 keywords) - Short CV of authors (recommended) Contributions are welcome from those working in industry and having experience in the topics of this conference as well as from academics. The conference language is English. It is strongly recommended to submit abstracts by electronic mail to icsc at icsc.ch or else by fax or mail (2 copies) to the following address: ICSC Switzerland P.O. Box 657 CH-8055 Zurich Switzerland Fax: +41-1-761-9627 ************************************************* BEST PRESENTATION AWARDS The best oral and poster presentations will be honored with best presentation awards. ************************************************* PUBLICATIONS Conference proceedings (including all accepted papers) will be published by ICSC Academic Press and be available for the delegates at the symposium in printed form or on CD-ROM. Authors of a selected number of innovative papers will be invited to submit extended manuscripts for publication in prestigious international journals. ************************************************* IMPORTANT DATES - Submission Deadline: October 31, 1999 - Notification of Acceptance: December 31, 1999 - Delivery of full papers: February 15, 2000 - Tutorials and Workshops: May 23, 2000 - NC'2000 Symposium: May 24-26, 2000 ************************************************* ACCOMMODATION Accommodation at reasonable rates will be available at nearby hotels. Full details will follow with the letters of acceptance. ************************************************* SOCIAL AND TOURIST ACTIVITIES A social program will be organized and will also be available for accompanying persons. ************************************************* THE FASCINATING CITY OF BERLIN The old and new capital of Germany is a mecca for scientists and cultural enthusiasts, for day workers and night owls. Charming with its several opera houses, concert halls, cabarets, and beer gardens, Berlin is full of spontaneous cultural events and happenings.
As an urban building site for the future, it is at the same time a living contradiction and a casual place with relaxation areas and large parks right in the city center. The fine nature, pine tree forests, and '1001' lakes around the city supply Berlin with its very specific 'sparkling air' in springtime. No other city in Germany has, during the last 100 years, played such a prominent role in history and in the imagination of the people: social and industrial revolution, World War I and later the proclamation of the first German republic, the 'Golden Twenties', Nazi dictatorship and World War II, division by the Berlin Wall, the 'economic miracle' in the west and a socialist showpiece city in the east, the '68 student and alternative lifestyle movement in the west and the peace movement in the east, and finally, the fall of the Wall. Berlin tempts with its many research facilities, and it is Germany's largest industrial city, with headquarters or branches of most major companies. At present, approximately 3.5 million inhabitants live in the reunified city, among them more than 120,000 students, who study at three universities and twelve colleges or schools of the arts. Berlin is worth the trip. Welcome! ************************************************* FURTHER INFORMATION Fully updated information is available from http://www.icsc.ab.ca/nc2000.htm You may also contact - ICSC International Computer Science Conventions P.O. Box 657, CH-8055 Zurich, Switzerland Email: icsc at icsc.ch Phone: +41-878-888-150 Fax: +41-1-761-9627 or, for specific scientific requests, the symposium chairman.
From brunel at asterix.ccs.brandeis.edu Tue Jul 6 16:58:49 1999 From: brunel at asterix.ccs.brandeis.edu (Nicolas Brunel) Date: Tue, 6 Jul 1999 16:58:49 -0400 (EDT) Subject: papers available Message-ID: Dear connectionists, The following two papers on the analysis of the dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons are now available on my web page: http://mumitroll.ccs.brandeis.edu/~brunel/journal.html "Fast global oscillations in networks of integrate-and-fire neurons with low firing rates" N Brunel and V Hakim to appear in Neural Computation, 11, 1621-1671 (1999) Abstract: We study analytically the dynamics of a network of sparsely connected inhibitory integrate-and-fire neurons in a regime where individual neurons emit spikes irregularly and at a low rate. In the limit when the number of neurons $N\rightarrow\infty$, the network exhibits a sharp transition between a stationary and an oscillatory global activity regime where neurons are weakly synchronized. The activity becomes oscillatory when the inhibitory feedback is strong enough. The period of the global oscillation is found to be mainly controlled by synaptic times, but depends also on the characteristics of the external input. In large but finite networks, the analysis shows that global oscillations of finite coherence time generically exist both above and below the critical inhibition threshold. Their characteristics are determined as functions of systems parameters, in these two different regimes. The results are found to be in good agreement with numerical simulations. "Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons" N Brunel to appear in Journal of Computational Neuroscience Abstract: The dynamics of networks of sparsely connected excitatory and inhibitory integrate-and-fire neurons is studied analytically. 
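Sparsely connected networks of this kind are also straightforward to explore numerically. The following is a minimal simulation sketch of a purely inhibitory LIF network with delayed synapses, in the spirit of the first paper; all parameter values are illustrative stand-ins, not those used in the papers.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameters (not those of the papers)
N, eps = 500, 0.1                      # neurons, connection probability
tau, v_th, v_reset = 20.0, 1.0, 0.0    # membrane time const (ms), threshold, reset
J, delay = 0.02, 2.0                   # inhibitory weight, synaptic delay (ms)
mu_ext = 1.2                           # constant suprathreshold external drive
dt, steps = 0.1, 2000                  # 0.1 ms resolution, 200 ms simulated

C = (rng.random((N, N)) < eps).astype(float)   # random sparse connectivity
v = rng.random(N) * v_th                       # random initial potentials
delay_steps = int(delay / dt)
spike_buf = [np.zeros(N, dtype=bool) for _ in range(delay_steps)]
spike_counts = np.zeros(N)

for t in range(steps):
    delayed = spike_buf[t % delay_steps]       # spikes emitted `delay` ms ago
    i_rec = J * (C @ delayed.astype(float))    # delayed recurrent inhibition
    v += dt / tau * (mu_ext - v) - i_rec       # leaky integration (Euler step)
    spikes = v >= v_th
    v[spikes] = v_reset
    spike_buf[t % delay_steps] = spikes        # ring buffer: overwrite after read
    spike_counts += spikes

rate = spike_counts.mean() / (steps * dt / 1000.0)  # mean rate in spikes/s
print(f"mean firing rate: {rate:.1f} Hz")
```

Varying the drive, the coupling strength, or the delay in such a sketch is a simple way to reproduce qualitatively the transitions between asynchronous and oscillatory regimes that the papers characterize analytically.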
The analysis reveals a very rich repertoire of states, including: Synchronous states in which neurons fire regularly; Asynchronous states with stationary global activity and very irregular individual cell activity; States in which the global activity oscillates but individual cells fire irregularly, typically at rates lower than the global oscillation frequency. The network can switch between these states, provided the external frequency, or the balance between excitation and inhibition, is varied. Two types of network oscillations are observed: In the `fast' oscillatory state, the network frequency is almost fully controlled by the synaptic time scale. In the `slow' oscillatory state, the network frequency depends mostly on the membrane time constant. Finite size effects in the asynchronous state are also discussed. Nicolas Brunel Volen Center for Complex Systems, MS 013, Brandeis University 415 South Street, Waltham, MA 02254-9110, USA Tel (781) 736 2890 -- Fax (781) 736 4877 Email brunel at asterix.ccs.brandeis.edu http://mumitroll.ccs.brandeis.edu/~brunel From rinehart at scf-fs.usc.edu Tue Jul 6 16:57:13 1999 From: rinehart at scf-fs.usc.edu (John Rinehart) Date: Tue, 06 Jul 1999 13:57:13 -0700 Subject: conference: Replacement Parts for the Brain: Intracranial Implantation of Hardware Models of Neural Circuitry Message-ID: <3.0.3.32.19990706135713.007a83c0@scf.usc.edu> Replacement Parts for the Brain: Intracranial Implantation of Hardware Models of Neural Circuitry An NIMH, USC-AMI Sponsored Conference August 12-14 1999, Willard Hotel Washington D.C. This conference will bring together leading researchers throughout the country for a focus on one of the newest frontiers of neuroscientific and bioengineering research: the intracranial implantation of computer chip models of brain function as neural prosthetics to replace damaged or dysfunctional brain tissue. 
In considering the development of "replacement parts for the brain", speakers will address recent advances in (i) biologically realistic mathematical models of brain or spinal cord function, (ii) silicon- and/or photonics-based computational devices which incorporate those models, and (iii) "neuron-silicon interface" devices, i.e., micron-scale multi-site electrode arrays to provide bi-directional communication between the computational element and functioning neuronal tissue. The conference is intended to synergize interdisciplinary research in the neural, engineering, and biomedical sciences. For more information regarding the conference and meeting registration, please see the conference web site: http://www.usc.edu/dept/biomed/NeuralProstheticsCNF Conference Organizers: Theodore W. Berger, Ph.D. Director, Center for Neural Engineering 500 Olin Hall Department of Biomedical Engineering University of Southern California Los Angeles, CA 90089-1450 telephone: 213-740-8017 fax: 213-740-0343 e-mail: berger at bmsrs.usc.edu Dennis Glanzman, Ph.D. Chief, Theoretical Neuroscience Program NIMH Room 11-102 5600 Fishers Lane Rockville, MD telephone: 301-443-1576 fax: 301-443-4822 e-mail: glanzman at helix.nih.gov From sbaluja at lycos.com Wed Jul 7 23:53:56 1999 From: sbaluja at lycos.com (sbaluja@lycos.com) Date: Wed, 7 Jul 1999 23:53:56 -0400 Subject: Paper: Memory-based Face Recognition Message-ID: <852567A8.0014E52E.00@pghmta2.mis.pgh.lycos.com> Paper: High-Performance Memory-based Face Recognition for Visitor Identification JPRC-Technical Report-1999-01 Authors: Terence Sim, Rahul Sukthankar, Matthew D.
Mullin & Shumeet Baluja Available from: http://www.cs.cmu.edu/~baluja/techreps.html & http://www.cs.cmu.edu/~rahuls/pub/ Abstract: We show that a simple, memory-based technique for view-based face recognition, motivated by the real-world task of visitor identification, can outperform more sophisticated algorithms that use Principal Components Analysis (PCA) and neural networks. This technique is closely related to correlation templates; however, we show that the use of novel similarity measures greatly improves performance. We also show that augmenting the memory base with additional, synthetic face images results in further improvements in performance. Results of extensive empirical testing on two standard face recognition datasets are presented, and direct comparisons with published work show that our algorithm achieves comparable (or superior) results. This paper further demonstrates that our algorithm has desirable asymptotic computational and storage behavior, and is ideal for incremental training. Our system is incorporated into an automated visitor identification system that has been operating successfully in an outdoor environment for several months. Contact: tsim at jprc.com, rahuls at jprc.com, mdm at jprc.com, sbaluja at lycos.com Comments and Questions are welcome! From peterw at cogs.susx.ac.uk Thu Jul 8 06:10:34 1999 From: peterw at cogs.susx.ac.uk (Peter Williams) Date: Thu, 8 Jul 1999 11:10:34 +0100 Subject: Senior post in Neural Computation Message-ID: UNIVERSITY OF SUSSEX SCHOOL OF COGNITIVE AND COMPUTING SCIENCES SENIOR FACULTY POST - Neural Computation/Computer Vision (Ref 133) Applicants are invited for a permanent faculty position, up to Chair level, within the Computer Science and Artificial Intelligence Subject Group of the School of Cognitive and Computing Sciences. The expected start date is 1 October 1999 or as soon as possible thereafter. 
Candidates should be able to show evidence of significant research achievement in Neural Computation or Computer Vision. The successful applicant will be expected to expand significantly the existing high research profile of the Group in this area. Applicants may also be expected to contribute to teaching in more general areas of Computer Science or Artificial Intelligence. The level of the appointment, Professor/Reader/Senior Lecturer, will be appropriate to the achievements and potential of the successful candidate. Salary in the range: 30,496 - 34,464 per annum (Senior Lecturer/ Reader) or 35,170 minimum per annum (Professorial). Salaries under review. Informal enquiries may be made to Dr Peter Williams on Tel +44 1273 678756. Email peterw at cogs.susx.ac.uk. Details of the School are available at http://www.cogs.susx.ac.uk. Closing date: Friday 13 August 1999. Application forms and further particulars are available from and should be returned to Liz Showler, Staffing Services Office, Sussex House, University of Sussex, Falmer, Brighton BN1 9RH, UK. Tel +44 1273 877324 Email E.S.Showler at sussex.ac.uk. Details of all posts can also be found via the website below. http://www.susx.ac.uk/Units/staffing An Equal Opportunity Employer From cmbishop at microsoft.com Fri Jul 9 04:52:22 1999 From: cmbishop at microsoft.com (Christopher Bishop) Date: Fri, 9 Jul 1999 01:52:22 -0700 Subject: New book: "Neural Networks and Machine Learning" Message-ID: <3FF8121C9B6DD111812100805F31FC0D101F241E@RED-MSG-59> New book: "Neural Networks and Machine Learning" Christopher M. Bishop (Ed.) Springer-Verlag (for information on how to order: see below) Contents: B. D. Ripley: "Statistical Principles of Model Fitting" L. Breiman: "Bias-Variance, Regularization, Instability and Stabilization" J. M. Buhmann and N. Tishby: "Empirical Risk Optimization: A Statistical Learning Theory of Data Clustering" E. D.
Sontag: "VC Dimension of Neural Networks" R. M. Neal: "Assessing Relevance Determination Methods using DELVE" D. J. C. MacKay: "Introduction to Gaussian Processes" H. Zhu, C. K. I. Williams, R. Rohwer and M. Morciniec: "Gaussian Regression and Optimal Finite Dimensional Linear Models" T. S. Jaakkola and M. I. Jordan: "Variational Methods and the QMR-DT Database" D. Barber and C. M. Bishop: "Ensemble Learning in Bayesian Neural Networks" V. Vapnik: "The Support Vector Method of Function Estimation" B. Sallans, G. E. Hinton and Z. Ghahramani: "A Hierarchical Community of Experts" E. B. Baum: "Manifesto for an Evolutionary Economics of Intelligence" Preface: From harnad at coglit.ecs.soton.ac.uk Fri Jul 9 12:47:34 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Fri, 9 Jul 1999 17:47:34 +0100 (BST) Subject: SPEECH RECOGNITION: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article MERGING INFORMATION IN SPEECH RECOGNITION: FEEDBACK IS NEVER NECESSARY by Norris D., McQueen J. M., Cutler A., *** please see also 5 important announcements about new BBS policies and address change at the bottom of this message) *** This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. 
To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL by July 21st to: bbs at cogsci.soton.ac.uk or write to Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection on the Web. _____________________________________________________________ MERGING INFORMATION IN SPEECH RECOGNITION: FEEDBACK IS NEVER NECESSARY Norris Dennis. Medical Research Council Cognition and Brain Sciences Unit, 15, Chaucer Rd., Cambridge, CB2 2EF, U.K. Dennis.Norris at mrc-cbu.cam.ac.uk http://www.mrc-cbu.cam.ac.uk/ James M. McQueen and Anne Cutler Max-Planck-Institute for Psycholinguistics, Wundtlaan 1, 6525 XD Nijmegen, The Netherlands James.McQueen at mpi.nl and Anne.Cutler at mpi.nl http://www.mpi.nl ABSTRACT: Top-down feedback does not benefit speech recognition; on the contrary, it can hinder it. No experimental data imply that feedback loops are required for speech recognition. Feedback is accordingly unnecessary and spoken word recognition is modular. To defend this thesis, we analyse lexical involvement in phonemic decision-making. TRACE (McClelland & Elman 1986), a model with feedback from the lexicon to prelexical processes, is unable to account for all the available data on phonemic decision-making.
The modular Race model (Cutler & Norris 1979) is likewise challenged by some recent results however. We therefore present a new modular model of phonemic decision-making, the Merge model. In Merge, information flows from prelexical processes to the lexicon without feedback. Because phonemic decisions are based on the merging of prelexical and lexical information, Merge correctly predicts lexical involvement in phonemic decisions in both words and nonwords. Computer simulations show how Merge is able to account for the data through a process of competition between lexical hypotheses. We discuss the issue of feedback in other areas of language processing, and conclude that modular models are particularly well suited to the problems and constraints of speech recognition. KEYWORDS: feedback, modularity, phonemic decisions, lexical processing, computational modelling, word recognition, speech recognition, reading, ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web from the US or UK BBS Archive. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.norris.html ____________________________________________________________ *** FIVE IMPORTANT ANNOUNCEMENTS *** ------------------------------------------------------------------ (1) There have been some very important developments in the area of Web archiving of scientific papers very recently. 
Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ -------------------------------------------------------------------- (4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. 
Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html --------------------------------------------------------------------- (5) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). From harnad at coglit.ecs.soton.ac.uk Fri Jul 9 13:00:58 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Fri, 9 Jul 1999 18:00:58 +0100 (BST) Subject: LOCALIST CONNECTIONISM: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article CONNECTIONIST MODELLING IN PSYCHOLOGY: A LOCALIST MANIFESTO by Mike Page. This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences.
Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL by July 21st to: bbs at cogsci.soton.ac.uk or write to [PLEASE NOTE SLIGHTLY CHANGED ADDRESS]: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection at BBS's Princeton or Southampton Website. _____________________________________________________________ CONNECTIONIST MODELLING IN PSYCHOLOGY: A LOCALIST MANIFESTO Mike Page Medical Research Council Cognition and Brain Sciences Unit, 15, Chaucer Rd., Cambridge, CB2 2EF, U.K. mike.page at mrc-cbu.cam.ac.uk http://www.mrc-cbu.cam.ac.uk/ ABSTRACT: Over the last decade, fully-distributed models have become dominant in connectionist psychological modelling, whereas the virtues of localist models have been underestimated. This target article illustrates some of the benefits of localist modelling. Localist models are characterized by the presence of localist representations rather than the absence of distributed representations. A generalized localist model is proposed that exhibits many of the properties of fully distributed models. 
It can be applied to a number of problems that are difficult for fully distributed models and its applicability can be extended through comparisons with a number of classic mathematical models of behaviour. There are reasons why localist models have been underused and these are addressed. In particular, many conclusions about connectionist representation, based on neuroscientific observation, are called into question. There are still some problems inherent in the application of fully distributed systems and some inadequacies in proposed solutions to these problems. In the domain of psychological modelling, localist modelling is to be preferred. KEYWORDS: connectionist modelling, neural networks, localist, distributed, competition, choice, reaction-time, consolidation. ___________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web at the US or UK BBS Archive. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.page.html ____________________________________________________________
From gerstner at lamisun1.epfl.ch Mon Jul 12 11:59:08 1999 From: gerstner at lamisun1.epfl.ch (Wulfram Gerstner) Date: Mon, 12 Jul 1999 17:59:08 +0200 (MET DST) Subject: PhD-studentship Message-ID: <199907121559.RAA09199@mantrasun8.epfl.ch> Two PhD scholarships are available at the Swiss Federal Institute of Technology in Lausanne (EPFL) for thesis research on A) Information Theoretic Approaches to Spike Coding B) Kernels and Support Vector Machines for Vision The PhD students will be part of the Center for Neuromimetic Systems, see http://diwww.epfl.ch/mantra Highly motivated candidates with a strong theory background (math, physics) and an interest in statistical learning theory, information theory, and machine learning should apply by sending their CV together with the name of a reference to wulfram.gerstner at di.epfl.ch. Subject line: application Unformatted text or postscript format only. --------------------------------------------------------------------- Wulfram Gerstner Swiss Federal Institute of Technology Lausanne Assistant Professor Centre for Neuro-mimetic Systems Computer Science Department, EPFL, IN-J, 032 1015 Lausanne EPFL Tel. +41-21-693 6713 wulfram.gerstner at di.epfl.ch Fax.
+41-21-693 5263 http://diwww.epfl.ch/mantra --------------------------------------------------------------------- From moris at ims.u-tokyo.ac.jp Mon Jul 12 23:31:16 1999 From: moris at ims.u-tokyo.ac.jp (Shinichi Morishita) Date: Tue, 13 Jul 1999 12:31:16 +0900 Subject: Call for Papers: PAKDD2000 Message-ID: <01BECD2B.9A4EDCE0@moris-dh223.ims.u-tokyo.ac.jp> ======================================================================== CALL FOR PAPERS: PAKDD-2000 ======================================================================== The Fourth Pacific-Asia Conference on Knowledge Discovery and Data Mining Kyoto, Japan April 18-20, 2000 Papers Due: October 10, 1999 Sponsored by: (to be confirmed) Japanese Society of Artificial Intelligence SIG-KBS (Knowledge Base Systems), SIG-FAI (Fundamental AI) The Institute of Electronics, Information and Communication Engineers SIG-DE (Data Engineering), SIG-AI (Artificial Intelligence) Japan Society for Software Science and Technology SIG-DM (Data Mining) Information Processing Society of Japan SIG-DB (Data Base), SIG-ICS (Intelligent & Complex Systems) ACM SIG-MOD Japan Keihanna Interaction Plaza, Inc. The Fourth Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD-2000) will provide an international forum for the sharing of original research results and practical development experiences among researchers and application developers from different KDD-related areas such as machine learning, databases, statistics, knowledge acquisition, data visualization, knowledge-based systems, soft computing, and high performance computing. It will follow the success of PAKDD-97 held in Singapore in 1997, PAKDD-98 held in Australia in 1998, and PAKDD-99 held in China in 1999 by bringing together participants from universities, industry and government. Papers on all aspects of knowledge discovery and data mining are welcome.
Areas of interest include, but are not limited to:
- Theory and Foundational Issues in KDD
  * Data and Knowledge Representation
  * Logic for/of Knowledge Discovery
  * Expanding the Autonomy of Machine Discoverers
  * Human Factors
  * Scientific Discovery
  * New Theory, Philosophy, and Methodology
- KDD Algorithms and Methods
  * Machine Learning Methods
  * Statistical Methods
  * Heuristic Search
  * Inductive Logic Programming
  * Deduction, Induction and Abduction
  * Discovery of Exceptions and Deviations
  * Multi-criteria Evaluation and Data Mining Metrics
  * Hybrid and Multi-agent Methods
  * Evaluation of Complexity, Efficiency, and Scalability of Algorithms
- Process-Centric KDD
  * Models and Framework of the Knowledge Discovery Process
  * Data and Dimensionality Reduction
  * Preprocessing and Postprocessing
  * Interestingness Checking of Data and Rules
  * Management and Refinement for the Discovered Knowledge
  * Decomposition of Large Data Sets
  * Discretisation of Continuous Data
  * Data and Knowledge Visualization
  * Role of Domain Knowledge and Reuse of Discovered Knowledge
  * KDD Process and Human Interaction
- Soft Computing for KDD
  * Information Granulation and Granular Computing
  * Rough Sets in Data Mining
  * Neural Networks, Probabilistic Reasoning
  * Noise Handling and Uncertainty Management
  * Hybrid Symbolic/Connectionist KDD Systems
- High Performance Data Mining and Applications
  * Multi-Database Mining
  * Data Mining in Advanced Databases (OODB, Spatial DB, Multimedia DB)
  * Database Reverse Engineering
  * Integration of Data Warehousing, OLAP and Data Mining
  * Combining Data Mining with Database Querying
  * Parallel and Distributed Data Mining
  * Data Mining on the Internet
  * Multi-agent, Multi-task KDD Systems
  * Data Mining from Unstructured and Multimedia Data
  * Unification of Data Mining with Intelligent Information Retrieval
  * Security and Privacy Issues
  * Successful/Innovative KDD Applications in Science, Engineering, Medicine, Business, Education, Government, and Industry
Both
research and applications papers are solicited. All submitted papers will be reviewed on the basis of technical quality, relevance to KDD, originality, significance, and clarity. Accepted papers are expected to be published in the conference proceedings by Springer-Verlag in the Lecture Notes in Artificial Intelligence series. A selected number of PAKDD-2000 accepted papers will be expanded and revised for inclusion in major Japanese and/or international journals. Candidates include "Knowledge and Information Systems: An International Journal" by Springer-Verlag (http://kais.mines.edu/~kais/). The PAKDD Best Paper Award will be conferred on the authors of the best paper at the conference. The winner will be honored with US$500 and free registration. Authors are invited to email postscript files of their papers to terano at gssm.otsuka.tsukuba.ac.jp or to send four copies of them to Prof. Takao Terano (PAKDD-2000) Graduate School of Systems Management, The University of Tsukuba, Tokyo 3-29-1 Otsuka, Bunkyo-ku, Tokyo 112, Japan Tel.: +81-3-3942-6855 Fax.: +81-3-3942-6829 Email: terano at gssm.otsuka.tsukuba.ac.jp Electronic submission is highly preferred. Papers must be received by October 10, 1999. Notification of acceptance will be emailed to the first (or designated) author by December 15, 1999. Camera-ready copy of accepted papers will be due January 15, 2000. Format: The paper should consist of a cover page with title, authors' names, postal and e-mail addresses, an approximately 200-word summary, up to 5 keywords, and a single-spaced body not longer than ten pages. It is recommended that the authors use the style file of Springer-Verlag (http://www.springer.de/comp/lncs/authors.html) to minimize possible conflicts over paper length when preparing the camera-ready copy.
PAKDD Steering Committee:
=========================
Xindong Wu, Colorado School of Mines, USA (Chair)
Hongjun Lu, National University of Singapore (Co-Chair)
Rao Kotagiri, University of Melbourne, Australia
Huan Liu, National University of Singapore
Hiroshi Motoda, Osaka University, Japan
Lizhu Zhou, Tsinghua University, China
Ning Zhong, Yamaguchi University, Japan
Conference Chairs:
==================
Masaru Kitsuregawa, University of Tokyo, Japan
Hiroshi Motoda, Osaka University, Japan
Program Chairs:
===============
Takao Terano, Tsukuba University, Japan (Chair)
Arbee L. P. Chen, National Tsing Hua University, Taiwan (Co-chair)
Huan Liu, National University of Singapore, Singapore (Co-chair)
Publicity Chair:
================
Shinichi Morishita, University of Tokyo, Japan
Workshop Chair:
===============
Takahira Yamaguchi, Shizuoka University, Japan
Tutorial Chair:
===============
Shusaku Tsumoto, Shimane Medical University, Japan
Local Organizing Committee Chair:
=================================
Shiro Takata, Keihanna Interaction Plaza, Inc., Japan
Program Committee:
==================
(To be announced)
Further Information:
====================
Takao Terano (PAKDD-2000) Graduate School of Systems Management, The University of Tsukuba, Tokyo 3-29-1 Otsuka, Bunkyo-ku, Tokyo 112, Japan Tel.: +81-3-3942-6855 Fax.: +81-3-3942-6829 Email: terano at gssm.otsuka.tsukuba.ac.jp
---------------------------------------------------------------
Call for Tutorial Proposals
---------------------------
PAKDD-2000 will offer a tutorial program on KDD topics. Only a limited number of tutorials can be presented, and the selection will be guided by perceived quality and relevance to the conference. If you are interested in giving a tutorial, please send a proposal to tsumoto at computer.org by Oct. 10, 1999.
Call for Workshop Proposals
---------------------------
PAKDD-2000 will provide a venue for a few workshops to focus on advanced research areas of KDD.
Please submit suggestions for workshop proposals to yamaguti at cs.inf.shizuoka.ac.jp by Oct. 10, 1999.
Call for Panel Proposals
------------------------
Proposals are sought for panels that stimulate interaction between the communities contributing to KDD. Include the title, the main goals, prospective participants and a summary of the topics to be discussed. Please email panel proposals to terano at gssm.otsuka.tsukuba.ac.jp by Oct. 10, 1999.
Call for Exhibits and Industry Sessions
---------------------------------------
PAKDD-2000 will organize a data mining, OLAP and data warehouse product exhibition. Please send proposals for exhibits to terano at gssm.otsuka.tsukuba.ac.jp by Oct. 10, 1999.
--------------------------------------------------------------------
From tom at erato.atr.co.jp Tue Jul 13 06:51:12 1999 From: tom at erato.atr.co.jp (Tomohiro Shibata) Date: Tue, 13 Jul 1999 19:51:12 +0900 Subject: Humanoid Robot Demonstration Message-ID: <19990713195112G.tom@erato.atr.co.jp> The home page presenting information about a humanoid robot that we developed is available online at http://www.erato.atr.co.jp/DB/ The following draft is what we used for the press release on June 24, 1999. ------- Subject: Humanoid Robot Demonstration Embargoed until: June 24, 1999, 13:00 Abstract: The Kawato Dynamic Brain Project, ERATO, JST introduces the HUMANOID ROBOT, a dexterous anthropomorphic robot that has the same kinematic structure as the human body with 30 active degrees of freedom (without fingers). We believe that employing a HUMANOID ROBOT is the first step towards a complete understanding of high-level functions of the brain by mathematical analysis. For demonstration purposes, the HUMANOID ROBOT performs the Okinawa folk dance "Kacha-shi" and learns human-like eye movements based on neurobiological theories.
It is noteworthy that the acquisition of the Okinawa folk dance was achieved based on "learning from demonstration", which is in sharp contrast to the classic approach of manual robot programming. Learning from demonstration means learning by watching a teacher perform the task. In our approach to learning from demonstration, a reward function is learned from the demonstration, together with a task model that can be acquired from repeated attempts to perform the task. Knowledge of the reward function and the task model allows the robot to compute an appropriate control mechanism. Over the past few years, we have made significant progress in "learning from demonstration", such that we are able to apply the developed theories to the HUMANOID ROBOT. We believe that learning from demonstration will provide one of the most important footholds to understanding the information processes of sensori-motor control and learning in the brain. We believe that the following three levels are essential for a complete understanding of brain functions: (a) hardware level; (b) information representation and algorithms; and (c) computational theory. We are studying high-level functions of the brain by utilizing multiple methods such as neurophysiological analysis of the basal ganglia and cerebellum; psychophysical and behavioral analysis of visual motor learning; measurement of brain activity by fMRI; mathematical analysis; computer simulation of neural networks; and robotics experiments using the HUMANOID ROBOT. For instance, in one of our approaches, we are trying to learn a neural network model for motor learning with the HUMANOID ROBOT that incorporates data from psychophysical and behavioral experiments as well as data on brain activity from fMRI studies. The HUMANOID ROBOT reproduces a learned model in a real task, and we are able to verify the model by checking its robustness and performance.
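The learning-from-demonstration loop described above (infer a reward function from demonstrations, pair it with a task model, then derive a controller) can be illustrated in miniature. The following is a toy sketch only, not the project's actual formulation: the quadratic reward, the trivial point-mass task model, and every function name here are assumptions made for the example.

```python
import numpy as np

# Toy sketch of learning from demonstration (all names are hypothetical):
# 1. infer a reward function from demonstrated trajectories,
# 2. assume a (here trivial) task model,
# 3. derive a controller that ascends the inferred reward.

def infer_reward(demos):
    """Fit a reward r(x) = -||x - target||^2, taking the target to be the
    mean final state of the demonstrations."""
    target = np.mean([d[-1] for d in demos], axis=0)
    return target, lambda x: -float(np.sum((x - target) ** 2))

def derive_controller(target, gain=0.5):
    """Proportional controller that moves toward the reward optimum."""
    return lambda x: gain * (target - x)

def rollout(controller, x0, steps=50):
    """Task model: the state changes by exactly the commanded step."""
    x = np.asarray(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + controller(x)
        traj.append(x.copy())
    return traj

# Two demonstrations of reaching the point (1, 1).
demos = [np.array([[0.0, 0.0], [0.5, 0.4], [1.0, 1.0]]),
         np.array([[0.1, 0.0], [0.6, 0.5], [1.0, 1.0]])]
target, reward = infer_reward(demos)
traj = rollout(derive_controller(target), x0=[0.0, 0.0])
```

Each step halves the remaining distance to the inferred target, so the rollout converges to the demonstrated end state; in the actual research the trivial task model above is replaced by one learned from repeated attempts at the task.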
A lot of attention is being given to the study of brain functions using this new tool: the HUMANOID ROBOT. This should be a first important step towards changing the future of brain science. Date of press release: June 24, 1999 Organizer: JST Host Organizer: STA Style: Material along with lecture, videotape, and demonstration Place: Experiment room of the Kawato Dynamic Brain Project, Kyoto Chairman: Mitsuo Kawato, Project Director (one hour including questions.) -- SHIBATA, Tomohiro, Ph.D. | email: tom at erato.atr.co.jp Kawato Dynamic Brain Project | WWW: http://www.erato.atr.co.jp/~tom ERATO | Robotics in Japan page: JST | http://www.erato.atr.co.jp/~tom/jrobres.html From wray at EECS.Berkeley.EDU Tue Jul 13 21:36:01 1999 From: wray at EECS.Berkeley.EDU (Wray Buntine) Date: Tue, 13 Jul 1999 18:36:01 -0700 (PDT) Subject: research position on NASA-funded project, synthesis of learning alg. Message-ID: <199907140136.SAA21107@ic.EECS.Berkeley.EDU> Immediate opening for a late summer position, full or half-time, funded by NASA Ames Research Center, for a strong master's, in-progress PhD, or postdoctoral candidate. Only American citizens or resident aliens may apply. An introduction to the research for the project can be found in the group's pending KDD'99 paper available at: http://www-cad.eecs.berkeley.edu/~wray/kdd99.ps and the relevance of the broader research goals at: http://www-cad.eecs.berkeley.edu/~wray/Mirror/x2009.pdf This is a unique opportunity to work with leading experts in both data analysis and automated software engineering http://ic.arc.nasa.gov/ic/projects/amphion/index.html developing NASA's future intelligent systems. The position is short term, has potential for growth, and is ideal for a student seeking late summer work. The Task ======== Applicant should have knowledge of statistical algorithms for data analysis, and familiarity with their application and their implementation.
Applicant will support NASA Ames' automated software engineering experts in their development of the system and in testing the system, and will co-develop the system with the data analysis expert. The System ========== Code synthesis is routinely used in industry to generate GUIs, for database support, and in computer-aided design, and has been successfully used in development projects for scheduling. Our research applies synthesis to statistical data analysis algorithms. For synthesis, we use a specification language that generalizes Bayesian networks, a dependency model on variables. Using decomposition methods and algorithm templates, our system transforms the network through several levels of representation into pseudo-code which can be translated into the implementation language of choice. Algorithm templates have been developed for a variety of sophisticated schemes including EM, mean-field, maximum a posteriori, and iteratively reweighted least squares. The transformation system is based on a term-rewriting core with the capacity for symbolic simplification and differentiation of sums and products over vectors, matrices and delta functions. We are currently developing a back-end optimizer for Matlab. Requirements ============ Applicant should be a keen and experienced coder with knowledge of the relevant data analysis algorithms and their implementation. Code is written in a mixture of Prolog (for the synthesis) and Lisp (for the back-end), and basic exposure to these languages is necessary. The back-end is Matlab, so exposure here is useful too. Development is on a Linux platform. Applicant should be willing to learn new languages/environments. Please contact Wray Buntine for further information at: wray at ic.eecs.berkeley.edu Applications ============ Email your resume or a URL to Wray Buntine at the above address. Microsoft Word documents definitely discouraged. No deadline, but sooner is better!
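To make concrete the kind of scheme the algorithm templates above cover, here is a hand-written EM routine for a two-component 1-D Gaussian mixture. This is a generic textbook sketch for illustration only; it is not output of, or code from, the synthesis system described in this posting.

```python
import numpy as np

def em_two_gaussians(x, iters=100):
    """Plain EM for a two-component 1-D Gaussian mixture."""
    # Initialise the means at the data extremes so the components separate.
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture weights, means, and variances.
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        pi = nk / x.size
    return mu, var, pi

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])
mu, var, pi = em_two_gaussians(x)
```

The point of the synthesis system described above is precisely that routines of this shape are derived automatically from the network specification and an algorithm template, rather than written by hand.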
From wahba at stat.wisc.edu Tue Jul 13 20:22:17 1999 From: wahba at stat.wisc.edu (Grace Wahba) Date: Tue, 13 Jul 1999 19:22:17 -0500 (CDT) Subject: gss software Message-ID: <199907140022.TAA24210@hera.stat.wisc.edu> The following software may be of interest to connectionists. It is for use with R, which is a public statistical package with many utilities of use to statisticians. The main R master site is http://www.ci.tuwien.ac.at/R/, a US mirror site is http://cran.stat.wisc.edu/, and the packages, including gss below, can be found at http://www.ci.tuwien.ac.at/R/src/contrib/PACKAGES.html ............................ ------- Start of forwarded message ------- From chong at stat.purdue.edu Tue Jul 6 12:40:14 1999 From: chong at stat.purdue.edu (Chong Gu) Date: Tue, 6 Jul 1999 11:40:14 -0500 Subject: New smoothing spline package gss Message-ID: Dear fellow R users, I just uploaded a new package gss to ftp.ci.tuwien.ac.at. The package name gss stands for General Smoothing Spline. In the current version (0.4-1), it handles nonparametric multivariate regression with Gaussian, Binomial, Poisson, Gamma, Inverse Gaussian, and Negative Binomial responses. I am still working on code for density estimation and hazard rate estimation to be made available in future releases. On the modeling side, gss uses tensor-product smoothing splines to construct nonparametric ANOVA structures using cubic spline, linear spline, and thin-plate spline marginals. The popular (main-effect-only) additive models are special cases of nonparametric ANOVA models. The syntax of gss functions resembles that of the lm and glm suites. Among new features that are not available from other spline packages are the standard errors needed for the construction of Wahba's Bayesian confidence intervals for smoothing spline fits, so you may want to try out gss even if you only want to calculate a univariate cubic spline or a single-term thin-plate spline.
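gss itself is an R package, but the basic object it generalizes, a univariate cubic smoothing spline, can be tried from Python via SciPy by readers without R. This is only a rough analogue under stated assumptions: SciPy's smoothing parameter `s` is unrelated to gss's smoothing-parameter selection, and nothing below touches gss or RKPACK.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy samples of a smooth function.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 2.0 * np.pi, 100)
y = np.sin(x) + rng.normal(0.0, 0.2, x.size)

# Cubic (k=3) smoothing spline; s bounds the sum of squared residuals,
# here set to roughly n * noise_variance.
spl = UnivariateSpline(x, y, k=3, s=x.size * 0.04)
yhat = spl(x)
```

The fitted curve should track sin(x) much more closely than the raw noisy samples do; in gss the analogous smoothing level is chosen by the package rather than supplied by hand.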
For those familiar with smoothing splines, gss is a front end to RKPACK, which encodes O(n^3) generic algorithms for reproducing-kernel-based smoothing spline calculation. Reports on bugs and suggestions for improvements/new features are most welcome. Chong Gu -.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.- r-announce mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html Send "info", "help", or "[un]subscribe" (in the "body", not the subject !) To: r-announce-request at stat.math.ethz.ch _._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._ From harnad at coglit.ecs.soton.ac.uk Wed Jul 14 13:24:52 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Wed, 14 Jul 1999 18:24:52 +0100 (BST) Subject: Neural Organization: BBS call for Multiple Review Message-ID: Below is the abstract of the Precis of a book that will shortly be circulated for Multiple Book Review in Behavioral and Brain Sciences (BBS): PRECIS OF Structure, Function, and Dynamics: An Integrated Approach to Neural Organization: BBS MULTIPLE BOOK REVIEW by Michael Arbib, Peter Erdi and John Szentagothai This book has been accepted for a multiple book review to be published in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Reviewers must be BBS Associates or nominated by a BBS Associate. (All prior BBS referees, editors, authors, and commentators are also equivalent to Associates.)
To be considered as a reviewer for this article, to suggest other appropriate reviewers, or for information about how to become a BBS Associate, please send EMAIL, BEFORE August 13, 1999, to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of reviewers, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a reviewer. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. Please also specify 1) whether you need the book and 2) whether you can make it by the deadline of October 15, 1999. Please note that it is the book, not the Precis, that is to be reviewed. It would be helpful if you indicated in your reply whether you already have the book or would require a copy. _____________________________________________________________ PRECIS OF: Structure, Function, and Dynamics: An Integrated Approach to Neural Organization BBS MULTIPLE BOOK REVIEW Michael Arbib Director, USC Brain Project, University of Southern California Los Angeles, CA 90089-2520 USA. Arbib at pollux.usc.edu Peter Erdi Head, Dept. Biophysics KFKI Research Institute for Particle and Nuclear Physics of the Hungarian Academy of Sciences H-1525 Budapest, P.O. Box 49, Hungary.
erdi at rmki.kfki.hu ABSTRACT: "Neural Organization: Structure, Function, and Dynamics" (Arbib, Erdi, and Szentagothai, 1997, Cambridge, MA: The MIT Press; henceforth Organization) shows how theory and experiment can supplement each other in an integrated, evolving account of structure, function, and dynamics. New data lead to new models; new models suggest the design of new experiments. Much of modern neuroscience seems excessively reductionist, focusing on the study of ever smaller microsystems with little appreciation of their contribution to the behaving organism. We welcome these new data but are concerned to restore some equilibrium between systems, cellular, and molecular neuroscience. After a brief tribute to our late colleague John Szentagothai, we trace the threads of Structure, Function and Dynamics as they weave through the book, thus providing a broad general framework for the integration of computational and empirical neuroscience. Part II of Organization presents a structural analysis of various brain regions - olfactory bulb and cortex, hippocampus, cerebral cortex, cerebellum, and basal ganglia - as prelude to our account of the dynamics of the neural circuits and function of each region. To exemplify this approach, this precis analyzes the hippocampus in anatomical, dynamical, and functional terms. We conclude by pointing the way to the use of our methodology in the development of Cognitive Neuroscience. KEYWORDS: neural organization, dynamics, Szentagothai, computational neuroscience, neural modeling, modular architectonics, neural plasticity, hippocampus, rhythmogenesis, cognitive maps, memory. ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Please do not prepare a commentary on this draft.
Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.arbib.html
------------------------------------------------------------------
*** FIVE IMPORTANT ANNOUNCEMENTS ***
(1) There have been some very important developments in the area of Web archiving of scientific papers very recently. Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm
---------------------------------------------------------------------
(2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone.
---------------------------------------------------------------------
(3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/
--------------------------------------------------------------------
(4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles.
The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html --------------------------------------------------------------------- (5) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). 
From hali at theophys.kth.se Wed Jul 14 19:01:22 1999 From: hali at theophys.kth.se (Hans Liljenström) Date: Thu, 15 Jul 1999 01:01:22 +0200 Subject: Registration for Agora'99 Message-ID: <378D16C1.403F7A06@theophys.kth.se> ***************************************************************************** REGISTRATION INFORMATION for 1999 Agora Meeting on Fluctuations in Biological Systems (Agora'99) August 3-7, 1999, Sigtuna, Sweden Information and registration at http://www.theophys.kth.se/~hali/agora/agora99 ***************************************************************************** SCOPE This interdisciplinary conference on fluctuations in biological systems will be held in the small old town of Sigtuna, Sweden, Aug 3-7, 1999, and follows upon a series of workshops, the first of which was held in Sigtuna, Sep 4-9 1995 (Sigtuna Workshop 95). The approach at these meetings is theoretical as well as experimental, and they are intended to attract participants from various fields, such as biology, physics, and computer science. MOTIVATION Life is normally associated with a high degree of order and organization. However, disorder (in various contexts referred to as fluctuations, noise or chaos) is also a crucial component of many biological processes. For example, in evolution random errors in the reproduction of the genetic material provide the variation that is fundamental for the selection of adaptive organisms. At a molecular level, thermal fluctuations govern the movements and functions of the macromolecules in the cell. Yet, it is also clear that too large a variation may have disastrous effects. Uncontrolled processes need stabilizing mechanisms. More knowledge of the stability requirements of biological processes is needed in order to better understand these problems, which also have important medical applications. Many diseases, for instance certain degenerations of brain cells, are caused by failure of the stabilizing mechanisms in the cell.
Stability is also important and difficult to achieve in biotechnological applications. There is also randomness in the structure and function of the neural networks of the brain. Spontaneous firing of neurons seems to be important for maintaining an adequate level of activity, but does this "neuronal noise" have any other significance? What are the effects of errors and fluctuations in the information processing of the brain? Can these microscopic fluctuations be amplified to provide macroscopic effects? Often, one cannot easily determine whether an apparently random process is due to noise, governed by uncontrolled degrees of freedom, or if it is a result of "deterministic chaos". Would the difference be of any importance for biology? In particular, could chaos, which is characterized by sensitivity and divergence, be useful for any kind of information processing that normally depends upon stability and convergence? OBJECTIVE The objective of this meeting is to address questions and problems related to those above, for a deeper understanding of the effects of disorder in biological systems. Fluctuations and chaos have been extensively studied in physics, but to a much lesser degree in biology. Important concepts from physics, such as "noise-induced state transitions" and "controlled chaos", could also be of relevance for biological systems. Yet little has been done to pursue such applications, and a more critical analysis of the positive and negative effects of disorder for living systems is needed. It is essential to make concrete and testable hypotheses, and to avoid the kind of superficial and more fashionable treatment that often dominates the field. By bringing together scientists with knowledge and insights from different disciplines we hope to shed more light on these problems, which we think are profound for understanding the phenomenon of life.
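One way to make the noise-versus-chaos question concrete: deterministic chaos remains short-term predictable even though it looks random, while genuine noise does not. The following sketch is illustrative only (it is not part of the announcement); the logistic map and the nearest-neighbour one-step predictor are standard textbook choices for this kind of surrogate test:

```python
import numpy as np

def nn_forecast_error(x, train=800):
    """Mean one-step-ahead forecast error of a nearest-neighbour predictor.

    For each test point x[t], find the closest past value x[s] (s < train)
    and predict x[t+1] by x[s+1].  Deterministic chaos is short-term
    predictable, so this error stays small; for noise it does not.
    """
    past, future = x[:train - 1], x[1:train]   # x[s] and its successor x[s+1]
    errors = []
    for t in range(train, len(x) - 1):
        s = int(np.argmin(np.abs(past - x[t])))
        errors.append(abs(future[s] - x[t + 1]))
    return float(np.mean(errors))

# Deterministic chaos: the logistic map x -> 4x(1-x)
x = np.empty(1000)
x[0] = 0.3
for t in range(999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# A noise surrogate with exactly the same values in random order
y = np.random.default_rng(0).permutation(x)

err_chaos, err_noise = nn_forecast_error(x), nn_forecast_error(y)
print(err_chaos, err_noise)
```

The forecast error should be small for the chaotic series and large for the shuffled surrogate, even though both series have identical value distributions.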
TOPICS Topics include various aspects, experimental as well as theoretical, of fluctuations, noise and chaos in biological systems at the microscopic (molecular), mesoscopic (cellular), and macroscopic (network and systems) levels. Contributions are welcome regarding, among others, the following topics:
- Biological signals and noise
- Neural information processing
- Synaptic fluctuations
- Spontaneous neural firing
- Macromolecular dynamics
- Dynamics of microtubules
- Ion channel kinetics
- Cell motility
- Medical implications
INVITED SPEAKERS
Sergey Bezrukov, National Institutes of Health, Bethesda, USA
Hans Braun, Dept. of Physiology, University of Marburg, Germany
Anders Ehrenberg, Dept. of Biophysics, Stockholm University, Sweden
Hans Frauenfelder, Los Alamos National Laboratory, New Mexico, USA
Louis De Felice, Vanderbilt University, USA
Hermann Haken, Institute of Theoretical Physics and Synergetics, Univ. of Stuttgart, Germany
Uno Lindberg, Dept. of Cell Biology, Stockholm University, Sweden
Matsuno, Department of Biomedical Engineering, Nagaoka Univ. of Technology, Japan
Frank Moss, Dept. of Physics, University of Missouri, St Louis, USA
Erik Mosekilde, Dept. of Physics, Technical University of Denmark, Lyngby, Denmark
Sakire Pogun, Center for Brain Research, Ege University, Turkey
Rudolf Rigler, Dept. of Medical Biochemistry and Biophysics, Karolinska Institutet, Sweden
Stephen Traynelis, Dept. of Pharmacology and Physiology, Emory University, Georgia, USA
Horst Vogel, Swiss Federal Institute of Technology, Lausanne, Switzerland
Jim J. Wright, Mental Health Research Institute of Victoria, Melbourne, Australia
Michail Zhadin, Dept. of Physiology, Pushchino, Russia
REGISTRATION and abstract submission are preferably done via the Agora'99 home page: http://www.theophys.kth.se/~per/form REGISTRATION FEES Regular: 2000 SEK Students: 1000 SEK FURTHER INFORMATION available from: Hans Liljenström or Peter Århem Agora for Biosystems Box 57, SE-193 22 Sigtuna, Sweden Phone/Fax: +46-8-592 50901 Email: peter.arhem at neuro.ki.se, hali at theophys.kth.se WWW: http://www.theophys.kth.se/~hali/agora From shastri at ICSI.Berkeley.EDU Wed Jul 14 21:55:34 1999 From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri) Date: Wed, 14 Jul 1999 18:55:34 PDT Subject: Structured connectionism, temporal synchrony, and evidential reasoning Message-ID: <199907150155.SAA20814@lassi.ICSI.Berkeley.EDU> Dear Connectionists: The following paper may be of interest to you. It demonstrates that a fusion of structured connectionism, temporal synchrony, evidential reasoning, and fast synaptic potentiation can lead to a neural network capable of rapidly establishing causal and referential coherence. Best wishes. -- Lokendra Shastri --------------------------------------- http://www.icsi.berkeley.edu/~shastri/psfiles/fusion99.ps OR http://www.icsi.berkeley.edu/~shastri/psfiles/fusion99.pdf Title: Knowledge Fusion in the Large -- taking a cue from the brain Lokendra Shastri and Carter Wendelken International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704 In Proceedings of the Second International Conference on Information Fusion, Sunnyvale, CA, July 1999, pp. 1262-1269. From rog1 at psu.edu Thu Jul 15 08:25:57 1999 From: rog1 at psu.edu (Rick Gilmore) Date: Thu, 15 Jul 1999 08:25:57 -0400 Subject: Senior-level faculty position in neuroscience Message-ID: SENIOR-LEVEL NEUROSCIENTIST. As part of a major university-wide expansion in the life sciences, the Department of Psychology at Penn State University announces a search in cognitive, computational, or behavioral neuroscience.
Senior-level candidates who wish to play a leadership role in building Penn State's growing programs in neuroscience are encouraged to apply. Send a statement of research and teaching interests, vita, and recent reprints to: Neuroscience Search, 617 Moore Building, Department of Psychology, University Park, PA 16802. The Pennsylvania State University is an equal opportunity employer. From tlfine at ANISE.EE.CORNELL.EDU Thu Jul 15 20:09:42 1999 From: tlfine at ANISE.EE.CORNELL.EDU (Terrence Fine) Date: Thu, 15 Jul 1999 20:09:42 -0400 (EDT) Subject: new book, Feedforward Neural Network Methodology Message-ID: A new monograph, Feedforward Neural Network Methodology, by Terrence L. Fine, has appeared in the Springer Series on Statistics for Engineering and Information Science. It is priced at $69.95 for 340 pages. This monograph provides a thorough and coherent introduction to the mathematical properties of feedforward neural networks and to the computationally intensive methodology that has enabled their successful application to complex problems of pattern classification, forecasting, regression, and nonlinear systems modeling. Coherence is achieved by focusing on the class of feedforward neural networks, also called multilayer perceptrons, and orienting the discussion around the four questions: What functions can the network architecture implement or closely approximate? What is the complexity of an achievable implementation? Given the resources dictated by the complexity considerations, how do you select a network to achieve the task? How well will the selected network learn or generalize to new problem instances? 
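As a rough illustration of the class of networks the monograph studies (this sketch is not an excerpt from the book; the architecture, data, and learning rate are arbitrary choices), here is a one-hidden-layer multilayer perceptron trained by full-batch gradient descent to approximate a smooth function:

```python
import numpy as np

rng = np.random.default_rng(0)

# Data: approximate one period of a sine on [-1, 1]
X = np.linspace(-1.0, 1.0, 100)[:, None]
y = np.sin(np.pi * X)

# One hidden layer of tanh units and a linear output node
H = 20
W1, b1 = rng.normal(0.0, 1.0, (1, H)), np.zeros(H)
W2, b2 = rng.normal(0.0, 0.5, (H, 1)), np.zeros(1)

def forward(X):
    A = np.tanh(X @ W1 + b1)        # hidden activations
    return A, A @ W2 + b2           # network output

def mse():
    return float(np.mean((forward(X)[1] - y) ** 2))

mse_start = mse()
lr = 0.1
for _ in range(5000):                          # full-batch gradient descent
    A, out = forward(X)
    d_out = 2.0 * (out - y) / len(X)           # d(MSE)/d(out)
    dW2, db2 = A.T @ d_out, d_out.sum(0)
    d_hid = (d_out @ W2.T) * (1.0 - A ** 2)    # backprop through tanh
    dW1, db1 = X.T @ d_hid, d_hid.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse_end = mse()
print(mse_start, mse_end)
```

The approximation question (what the architecture can represent) and the learning question (how well gradient descent finds it) are exactly the first and last of the four questions above, here in miniature.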
Table of Contents
Preface
Chapter 1: Background and Organization
Chapter 2: Perceptrons---Networks with a Single Node
Chapter 3: Feedforward Networks I: Generalities and LTU Nodes
Chapter 4: Feedforward Networks II: Real-valued Nodes
Chapter 5: Algorithms for Designing Feedforward Networks
Chapter 6: Architecture Selection and Penalty Terms
Chapter 7: Generalization and Learning
Appendix: A Note on Use as a Text
References
Index
From achilles at uom.gr Fri Jul 16 04:50:54 1999 From: achilles at uom.gr (Achilles D. Zapranis) Date: Fri, 16 Jul 1999 11:50:54 +0300 Subject: new book Message-ID: <003d01becf68$51687060$cf535cc1@uom.gr> New book (monograph): "Principles of Neural Model Identification, Selection and Adequacy - With Applications to Financial Econometrics" Springer - Verlag ISBN 1-85233-139-9 Achilleas Zapranis and Apostolos-Paul Refenes Neural networks are receiving much attention because of their powerful universal approximation properties. They are essentially devices for non-parametric statistical inference, providing an elegant formalism for unifying different non-parametric paradigms, such as nearest neighbours, kernel smoothers, projection pursuit, etc. Neural networks have shown considerable success in a variety of disciplines, including engineering, control, and financial modelling. However, a major weakness of neural modelling is the lack of established procedures for performing tests for misspecified models and tests of statistical significance for the various parameters that have been estimated. This is a serious disadvantage in applications where there is a strong culture for testing not only the predictive power of a model or the sensitivity of the dependent variable to changes in the inputs but also the statistical significance of the finding at a specified level of confidence. This is very important in the majority of financial applications, where the data generating processes are dominantly stochastic and only partially deterministic.
In this book we investigate a broad range of issues arising in relation to the use of neural networks as non-parametric statistical tools, including controlling the bias and variance parts of the estimation error, eliminating parameter and explanatory-variable redundancy, assessing model adequacy and estimating sampling variability. Based upon the latest, most significant developments in estimation theory, model selection and the theory of misspecified models, this book develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the theoretical framework and demonstrates, through a selected case study and examples, the efficient use of neural networks for modelling complex financial phenomena. The majority of existing books on neural networks and their application to finance concentrate on some of the intricate algorithmic aspects of neural networks, the bulk of which is irrelevant to practitioners in this field. They use terminology that is incomprehensible to professional financial engineers, statisticians and econometricians, who are the natural readership in this subject. Neural networks are essentially statistical devices for non-linear, non-parametric regression analysis, but most of the existing literature discusses neural networks as a form of artificial intelligence. In our opinion this work meets an urgent demand for a textbook illustrating how to use neural networks in real-life financial contexts and provides methodological guidelines on how to develop robust applications that work from a platform of statistical insight.
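To give a flavour of "estimating sampling variability" by resampling, here is a generic pairs-bootstrap sketch. It is not taken from the book, and for brevity it bootstraps a least-squares slope rather than a neural model; the resampling logic is the same in either case:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise.  The same resampling idea applies when the
# estimator is a neural model rather than this least-squares slope.
n = 200
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 0.5, n)

def fit_slope(x, y):
    """Ordinary least-squares slope (intercept removed by centring)."""
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / (xc @ xc))

# Pairs bootstrap: resample (x_i, y_i) pairs with replacement and refit.
# The spread of the refitted estimates approximates the sampling
# variability of the estimator without distributional assumptions.
boots = np.array([fit_slope(x[idx], y[idx])
                  for idx in (rng.integers(0, n, n) for _ in range(500))])

slope = fit_slope(x, y)
std_err = float(boots.std())
lo, hi = np.percentile(boots, [2.5, 97.5])   # percentile confidence interval
print(slope, std_err, (lo, hi))
```

The bootstrap standard error and percentile interval are then available for the hypothesis tests and variable-selection procedures the book describes.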
Contents:
1 INTRODUCTION 1.1 Overview 1.2 Active Asset Management, Neural Networks and Risk 1.2.1 Factor Analysis 1.2.2 Estimating Returns 1.2.3 Portfolio Optimisation 1.3 Non-Parametric Estimation with Neural Networks 1.3.1 Sources of Specification Bias 1.3.2 Principles of Neural Models Identification 1.4 Overview of the Remaining Chapters
2 NEURAL MODEL IDENTIFICATION 2.1 Overview 2.2 Neural Model Selection 2.2.1 Model Specification 2.2.2 Fitness Criteria 2.2.3 Parameter Estimation Procedures 2.2.4 Consistency and the Bias-Variance Dilemma 2.3 Variable Significance Testing 2.3.1 Relevance Quantification 2.3.2 Sampling Variability Estimation 2.3.3 Hypothesis Testing 2.4 Model Adequacy Testing 2.5 Summary
3 REVIEW OF CURRENT PRACTICE IN NEURAL MODEL IDENTIFICATION 3.1 Overview 3.2 Current Practice in Neural Model Selection 3.2.1 Regularisation 3.2.2 Topology-Modifying Algorithms 3.2.3 The Structural Risk Minimisation (SRM) Principle 3.2.4 The Minimum Description Length (MDL) Principle 3.2.5 The Maximum a-Posteriori Probability (MAP) Principle 3.2.6 The Minimum Prediction Risk (MPR) Principle 3.3 Variable Significance Testing 3.3.1 Common Relevance Criteria 3.3.1.1 Criteria based on the Derivative dy/dx 3.3.1.2 Alternative Criteria 3.3.1.3 Comparing between Different Relevance Criteria 3.3.2 Sampling Variability and Bias Estimation with Bootstrap 3.3.2.1 Pairs Bootstrap 3.3.2.2 Residuals Bootstrap 3.3.2.3 Bias Estimation 3.3.3 Hypothesis Tests for Variable Selection 3.4 Model Adequacy Testing: Misspecification Tests 3.5 Summary
4 NEURAL MODEL SELECTION: THE MINIMUM PREDICTION RISK PRINCIPLE 4.1 Overview 4.2 Algebraic Estimation of Prediction Risk 4.3 Estimation of Prediction Risk with Resampling Methods 4.3.1 The Bootstrap and Jack-knife Methods for Estimating Prediction Risk 4.3.2 Cross-Validatory Methods for Estimating Prediction Risk 4.4 Evaluation of Model Selection Procedures 4.4.1 Experimental Set-Up 4.4.2 Algebraic Estimates 4.4.3 Bootstrap Estimates 4.4.4 Discussion 4.5 Summary
5 VARIABLE SIGNIFICANCE TESTING: A STATISTICAL APPROACH 5.1 Overview 5.2 Relevance Quantification 5.2.1 Sensitivity Criteria 5.2.2 Model-Fitness Sensitivity Criteria 5.2.2.1 The Effect on the Empirical Loss of a Small Perturbation of x 5.2.2.2 The Effect on the Empirical Loss of Replacing x by its Mean 5.2.2.3 Effect on the Coefficient of Determination of a Small Perturbation of x 5.3 Sampling Variability Estimation 5.3.1 Local Bootstrap for Neural Models 5.3.2 Stochastic Sampling from the Asymptotic Distribution of the Network's Parameters (Parametric Sampling) 5.3.3 Evaluation of Bootstrap Schemes for Sampling Variability Estimation 5.3.3.1 Example 1: The Burning Ethanol Sample 5.3.3.2 Example 2: Wahba's Function 5.3.3.3 Example 3: Network Generated Data 5.4 Hypothesis Testing 5.4.1 Confidence Intervals 5.4.2 Evaluating the Effect of a Variable's Removal 5.4.3 Variable Selection with Backwards Elimination 5.5 Evaluation of Variable Significance Testing 5.6 Summary
6 MODEL ADEQUACY TESTING 6.1 Overview 6.2 Testing for Serial Correlation in the Residuals 6.2.1 The Correlogram 6.2.2 The Box-Pierce Q-Statistic 6.2.3 The Ljung-Box LB-Statistic 6.2.4 The Durbin-Watson Test 6.3 An F-test for Model Adequacy 6.4 Summary
7 NEURAL NETWORKS IN TACTICAL ASSET ALLOCATION 7.1 Overview 7.2 Quantitative Models for Tactical Asset Allocation 7.3 Data Pre-Processing 7.4 Forecasting the Equity Premium with Linear Models 7.4.1 Model Estimation 7.4.2 Model Adequacy Testing 7.4.2.1 Testing the Assumptions of Linear Regression 7.4.2.2 The Effect of Influential Observations 7.4.3 Variable Selection 7.5 Forecasting the Equity Premium with Neural Models 7.5.1 Model Selection and Adequacy Testing 7.5.2 Variable Selection 7.5.2.1 Relevance Quantification 7.5.2.2 Sampling Variability Estimation 7.5.2.3 Backwards Variable Elimination 7.6 Comparative Performance Evaluation 7.7 Summary
8 CONCLUSIONS
Appendix I: Computing Network Derivatives
Appendix II: Generating Random Deviates
Bibliography
--------------------------------------------------------------------------------------------- Dr Achilleas D. Zapranis University of Macedonia of Economic and Social Sciences 156 Egnatia St. PO BOX 1591, 540 06 Thessaloniki Greece Tel. 00-30-(0)31-891690, Fax 00-30-(0)31-844536 e-mail : achilles at macedonia.uom.gr -------------------------------------------------------------------------------------------- From axon at cortex.rutgers.edu Fri Jul 16 09:53:01 1999 From: axon at cortex.rutgers.edu (Ralph Siegel) Date: Fri, 16 Jul 1999 09:53:01 -0400 Subject: Postdoctoral positions - visual system Message-ID: <001801becf92$850ab980$2a660680@stp.rutgers.edu> Postdoctoral Trainee (two positions). Analysis of visual structure-from-motion in primates. Representation of optic flow and attention in the temporal (STPa) and parietal (7a) lobes is being examined in the awake behaving monkey. These studies utilize single-unit recording, optical recording, and microdialysis techniques. The laboratory also has ongoing fMRI, human psychophysical, and computational studies. Recent graduates who are changing fields from either cellular or computational neuroscience to behavioral and physiological studies are particularly encouraged to apply. Computer expertise useful, but not necessary. Superb experimental and computational facilities in a multi-disciplinary research center. NY-NJ Metro area. Contact: Ralph Siegel, Ph.D. Center for Molecular and Behavioral Neuroscience Rutgers, The State University 197 University Avenue Newark, NJ 07102 phone: 973-353-1080 x3261 fax: 973-353-1272 axon at cortex.rutgers.edu http://www.cmbn.rutgers.edu/cmbn/faculty/rsiegel.html Term: Beginning July 1, 1999 Salary: NIH levels Please send statement of research interests, curriculum vitae, and names of three references.
From ted.carnevale at yale.edu Sat Jul 17 18:07:12 1999 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Sat, 17 Jul 1999 18:07:12 -0400 Subject: URL for NEURON at Yale Message-ID: <3790FE90.7A332295@yale.edu> NEURON's WWW site at Yale has been reorganized. As a consequence, the sole valid URL for NEURON's home page at Yale is http://www.neuron.yale.edu The URL of the Duke site remains http://neuron.duke.edu as before. Please excuse me if you also receive this notice from some other source. --Ted From terry at salk.edu Tue Jul 20 17:19:27 1999 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 20 Jul 1999 14:19:27 -0700 (PDT) Subject: NEURAL COMPUTATION 11:6 Message-ID: <199907202119.OAA07748@helmholtz.salk.edu> Neural Computation - Contents - Volume 11, Number 6 - August 15, 1999
VIEW
Seeing White: Qualia In The Context of Decoding Population Codes - Sidney R. Lehky and Terrence J. Sejnowski
NOTE
Adaptive Calibration of Imaging Array Detectors - Marco Budinich and Renato Frison
LETTERS
Modeling The Combination Of Motion, Stereo, And Vergence Angle Cues To Visual Depth - I. Fine and Robert A. Jacobs
Inferring The Features Of Recognition From Viewpoint-Dependent Recognition Data - Florin Cutzu and Michael Tarr
Backprojections In The Cerebral Cortex: Implications For Memory Storage - Alfonso Renart, Nestor Parga and Edmund T. Rolls
The Relationship Between Synchronization Among Neuronal Populations and Their Overall Activity Levels - D. Chawla, E. D. Lumer, K. J. Friston
Fast Calculation Of Short-Term Depressed Synaptic Conductances - Michele Giugliano, Marco Bove and Massimo Grattarola
Algorithmic Stability And Sanity-Check Bounds For Leave-One-Out Cross-Validation - Michael Kearns and Dana Ron
Convergence Properties Of The Softassign Quadratic Assignment Algorithm - Anand Rangarajan, Alan Yuille, and Eric Mjolsness
Learning to Design Synergetic Computers with an Expanded Symmetric Diffusion Network - Koji Okuhara, Shunji Osaki, and Masaaki Kijima
-----
ABSTRACTS - http://mitpress.mit.edu/NECO/
SUBSCRIPTIONS - 1999 - VOLUME 11 - 8 ISSUES
                 USA    Canada*  Other Countries
Student/Retired  $50    $53.50   $84
Individual       $82    $87.74   $116
Institution      $302   $323.14  $336
* includes 7% GST
(Back issues from Volumes 1-10 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu
-----
From juergen at idsia.ch Mon Jul 19 10:24:48 1999 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Mon, 19 Jul 1999 16:24:48 +0200 Subject: predictability minimization Message-ID: <199907191424.QAA13716@ruebe.idsia.ch> Processing Images by Semi-Linear Predictability Minimization ------------------------------------------------------------ Nicol N. Schraudolph, Martin Eldracher & Juergen Schmidhuber IDSIA, Lugano, Switzerland www.idsia.ch Network: Computation in Neural Systems, 10(2):133-169, 1999 In the predictability minimization approach (Neural Computation, 4(6):863-879, 1992), input patterns are fed into a system consisting of adaptive, initially unstructured feature detectors. There are also adaptive predictors constantly trying to predict current feature detector outputs from other feature detector outputs.
Simultaneously, however, the feature detectors try to become as unpredictable as possible, resulting in a co-evolution of predictors and feature detectors. This paper describes the implementation of a visual processing system trained by semi-linear predictability minimization, and presents many experiments that examine its response to artificial and real-world images. In particular, we observe that under a wide variety of conditions, predictability minimization results in the development of well-known visual feature detectors. ftp://ftp.idsia.ch/pub/juergen/pm.ps.gz Many additional papers now also available in postscript format: http://www.idsia.ch/~juergen/onlinepub.html Nici & Martin & Juergen From michael at cs.unm.edu Thu Jul 22 09:59:12 1999 From: michael at cs.unm.edu (Zibulevsky Michael) Date: Thu, 22 Jul 1999 07:59:12 -0600 Subject: new paper: Blind Source Separation by Sparse Decomposition Message-ID: Announcing a new paper.... Title: Blind Source Separation by Sparse Decomposition Authors: Michael Zibulevsky and Barak A. Pearlmutter Abstract The blind source separation problem is to extract the underlying source signals from a set of their linear mixtures, where the mixing matrix is unknown. This situation is common, e.g. in acoustics, radio, and medical signal processing. We exploit the property that the sources have a sparse representation in a corresponding (possibly overcomplete) signal dictionary. Such a dictionary may consist of wavelets, wavelet packets, etc., or be obtained by learning from a given family of signals. Starting from the maximum a posteriori framework, which is applicable to the case of more sources than mixtures, we derive a few other categories of objective functions, which provide faster and more robust computations when there are an equal number of sources and mixtures. Our experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques.
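The geometric intuition behind sparse decomposition can be sketched in a few lines: if the sources are so sparse that they are rarely active at the same time, each high-energy mixture sample points along one column of the mixing matrix, so the columns can be read off the data directly. This toy construction is the editor's illustration, not the authors' method; their maximum a posteriori formulation handles overlapping activity, noise, and overcomplete dictionaries:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two sparse sources: mostly zero, with positive spikes that (for the
# sake of the sketch) never occur at the same time.
T = 400
s = np.zeros((2, T))
s[0, 10:T:40] = rng.uniform(1.0, 2.0, 10)
s[1, 30:T:40] = rng.uniform(1.0, 2.0, 10)

A = np.array([[0.9, 0.3],
              [0.4, 0.8]])            # "unknown" mixing matrix
X = A @ s                             # observed mixtures

# When only one source is active, the sample x(t) points along the
# corresponding column of A, so the columns can be estimated by
# grouping the normalized high-energy samples by direction.
active = np.linalg.norm(X, axis=0) > 0.5
U = X[:, active] / np.linalg.norm(X[:, active], axis=0)
c1 = U[:, U[0] > U[1]].mean(axis=1)   # direction cluster 1
c2 = U[:, U[0] <= U[1]].mean(axis=1)  # direction cluster 2
A_hat = np.column_stack([c1, c2])     # estimate, up to scale and permutation

s_hat = np.linalg.solve(A_hat, X)     # recovered sources
print(np.round(A_hat, 3))
```

In this noiseless, non-overlapping case the recovered sources match the originals up to scale; the paper's contribution is precisely to make this idea work when those idealizations fail.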
URL of the ps file: http://iew3.technion.ac.il:8080/~mcib/ Contact: michael at cs.unm.edu, bap at cs.unm.edu From shs7 at cornell.edu Thu Jul 22 14:43:14 1999 From: shs7 at cornell.edu (Steven Strogatz) Date: Thu, 22 Jul 1999 14:43:14 -0400 Subject: visiting position in Nonlinear Systems, Cornell Message-ID: A non-text attachment was scrubbed... Name: not available Type: text/enriched Size: 2491 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/ea156b90/attachment.bin From chella at unipa.it Fri Jul 23 06:20:06 1999 From: chella at unipa.it (Antonio Chella) Date: Fri, 23 Jul 1999 12:20:06 +0200 Subject: INTERNATIONAL SCHOOL ON NEURAL NETS <> Message-ID: <379841D4.CA914FDD@unipa.it> ============================================================= INTERNATIONAL SCHOOL ON NEURAL NETS <> 4th Course: Subsymbolic Computation in Artificial Intelligence ERICE-SICILY: October 24-31, 1999 ============================================================= MOTIVATIONS Autonomous intelligent agents that perform complex real world tasks must be able to build and process rich internal representations that allow them to effectively draw inferences, make decisions, and, in general, perform reasoning processes concerning their own tasks. Within the computational framework of artificial intelligence (AI) this problem has been faced in different ways. According to the classical, symbolic approach, internal representations are conceived in terms of linguistic structures, as expressions of a "language of thought". Other traditions developed approaches that are less linguistically oriented, and more biologically and anatomically motivated. It is the case of neural networks, and of self-organizing and evolutionary algorithms. Empirical results concerning natural intelligent systems suggest that such approaches are not fully incompatible, and that different kinds of representation may interact. 
Similarly, it can be argued that the design of artificial intelligent systems can take advantage of different kinds of interacting representations, each suited to different tasks. From this perspective, theoretical frameworks and methodological techniques are needed that allow different kinds of representation to be employed together in a principled way. In particular, autonomous agents need to find the meaning of the symbols they use within their internal processes and in the interaction with the external world, thus overcoming the well-known symbol grounding problem. An information processing architecture for autonomous intelligent agents should exhibit processes that act on suitable intermediate levels, mediating among sensory data, the symbolic level, and actions. These processes could be defined in terms of subsymbolic computation paradigms, such as neural networks, self-organizing algorithms, and evolutionary algorithms. DIRECTOR OF THE COURSE: Salvatore Gaglio DIRECTOR OF THE SCHOOL: M.I.Jordan - M.Marinaro DIRECTOR OF THE CENTRE: A.
Zichichi SCIENTIFIC SECRETARIAT: Edoardo Ardizzone, Antonio Chella, Marcello Frixione WEB PAGE OF THE COURSE: http://dijkstra.cere.pa.cnr.it/ScuolaErice/ ============================================================= SPONSORS
+ ASEIT Advanced School on Electronics and Information Technology
+ AI*IA Associazione Italiana per l'Intelligenza Artificiale (Italian Association for Artificial Intelligence)
+ CERE - CNR CEntro di Studio sulle Reti di Elaboratori (Center of Study on Computer Networks)
+ CITC Centro Interdipartimentale sulle Tecnologie della Conoscenza
+ Commission of the European Communities
+ IEEE Neural Network Council
+ IIASS Istituto Internazionale per gli Alti Studi Scientifici (International Institute for Advanced Scientific Studies)
+ Ministero Italiano della Pubblica Istruzione (Italian Ministry of Education)
+ MURST Ministero dell'Universita' e della Ricerca Scientifica e Tecnologica (Italian Ministry of University and Scientific Research)
+ CNR Consiglio Nazionale delle Ricerche (Italian National Research Council)
+ Regione Siciliana (Sicilian Regional Government)
+ SIREN Societa' Italiana Reti Neuroniche (Italian Neural Networks Society)
+ University of Genoa
+ University of Palermo
+ University of Salerno
============================================================= PROGRAM FOUNDATIONS Introduction to Artificial Intelligence: Luigia Carlucci Aiello (University of Roma, "La Sapienza", Italy) Neural modelling of higher order cognitive processes: John Taylor (King's College, London, UK) Connectionist Models for Data Structures: Marco Gori (University of Siena, Italy) Neural Systems Engineering: Igor Aleksander (Imperial College, London, UK) ASEIT (Advanced School on Electronics and Information Technology) OPEN INTERNATIONAL WORKSHOP ON SUBSYMBOLIC TECHNIQUES AND ALGORITHMS Peter Gärdenfors (Lund University, Sweden) Igor Aleksander (Imperial College, London, UK) Teuvo Kohonen (Helsinki University of Technology, Finland) John Taylor (King's College,
London, UK) Ronald Arkin (Georgia Institute of Technology, USA) REPRESENTATION Conceptual Spaces: Peter G=E4rdenfors (Lund University, Sweden) Topological Self Organizing Maps: Teuvo Kohonen (Helsinki University of Technology, Finland) Symbolic representations: Luigia Carlucci Aiello (University of Roma, =ECLa Sapienza=EE, Italy) VISUAL PERCEPTION Evolutionary Processes for Artificial Perception: Giovanni Adorni, Stefano Cagnoni (University of Parma, Italy) Cognitive Architectures for Artificial Vision: Antonio Chella (Univ. of Palermo, Italy), Marcello Frixione (Univ. of Salerno, Italy), Salvatore Gaglio (Univ. of Palermo, Italy) Algorithms for Image Analysis: Vito Di Ges=F9 (Univ. of Palermo, Italy) ACTION Motion Maps: Pietro Morasso (University of Genoa, Italy) Interacting with the External World: Luc Steels (Free University of Brussels, Belgium) Reinforcement Learning in Autonomous Robots: Christian Balkenius (Lund University, Sweden) Behavior-Based Robotics: Ronald Arkin (Georgia Institute of Technology, USA) ============================================================= APPLICATIONS Interested candidates should send a letter to the Director of the Course: Professor Salvatore GAGLIO Dipartimento di Ingegneria Automatica e Informatica Universita' di Palermo Viale delle Scienze 90128 - PALERMO - ITALY Tel: ++39.091.238245 Fax: ++39.091.6529124 E-mail: gaglio at unipa.it They should specify: 1.date and place of birth, together with present nationality; 2.affiliation; 3.address, e-mail address. Please enclose a letter of recommendation from the group leader or the Director of the Institute or from a senior scientist. PLEASE NOTE Participants must arrive in Erice on October 24, not later than 5:00 pm. 
=============================================================
From pg at artificial-life.com Sun Jul 25 11:20:36 1999 From: pg at artificial-life.com (Paolo Gaudiano) Date: Sun, 25 Jul 1999 11:20:36 -0400 Subject: Call for Participation Message-ID: <000901bed6b1$4006de60$4ca771d1@kmdg.cove.com>

First USA-Italy Conference on Applied Neural and Cognitive Sciences
Boston, October 3-6, 1999

The Italian Ministry of Foreign Affairs, through the Embassy of Italy in Washington, has identified Applied Neural and Cognitive Sciences as an area of strategic interest for research and development in Italy over the next 10 to 20 years. UI-CANCS'99, which is sponsored jointly by the Embassy of Italy and by Artificial Life, Inc., compares and contrasts industry and research experiences in Italy and in the United States. UI-CANCS'99 focuses on smart sensors, intelligent robotics, and related technologies with strong potential for industrial, medical and other applications. Invited presentations will define the current state of the art and future developments in these fields, compare and contrast scientific and technological resources in the USA and Italy, and try to identify the most promising and effective sectors in Italian industries and universities to be developed and supported over the next 10 to 20 years. UI-CANCS'99 offers a unique opportunity to present and discuss a broad range of scientific and industrial projects within the context of applied neural and cognitive sciences. The Conference also aims at strengthening collaborations between the USA and Italy and at developing new joint projects. We hope that this will be the first in a series of international meetings held regularly to monitor progress in these fields, to identify areas in need of further development support, and to offer a meeting point between Italian and American members of industry, academia and other organizations.
For additional information please visit http://www.usa-italy.org or send e-mail to info at usa-italy.org.
----------------------------------------------------------------------
Conference dates: Sunday, October 3 through Wednesday, October 6, 1999. These dates include a welcoming reception on Sunday evening and an optional visit to local laboratories in the Boston area on Wednesday, October 6. Invited talks will be given all day Monday and Tuesday morning, while a round-table discussion will be held on Tuesday afternoon.

Location: George Sherman Union, Boston University, 775 Commonwealth Avenue, Boston, Massachusetts, USA.

Organizing Committee:
Daniele Amati, SISSA
Emilio Bizzi, MIT
Paolo Gaudiano, Artificial Life, Inc.
Alexander Tenenbaum, Embassy of Italy

Official Language: English
Presentations: All talks are by invitation.
Registration: See registration form at end.
----------------------------------------------------------------------
Preliminary Program

SUNDAY, OCTOBER 3, 1999
6.00-9.00pm Registration; Welcoming Reception (with full registration)

MONDAY, OCTOBER 4, 1999
8.00-9.00am Registration; Breakfast (with full registration)
9.00-9.45am Opening Remarks
9.45am-1.00pm (with 15-min break at 11.15am) Session 1, Smart Sensors
1.00-3.45pm Lunch
3.45-7.00pm (with 15-min break at 5.15pm) Session 2, Intelligent Robotics and Software Agents
8.00pm Official dinner and plenary talk (with full registration)

TUESDAY, OCTOBER 5, 1999
8.00-9.00am Breakfast (with full registration)
9.00am-12.15pm (with 15-min break at 11.15am) Session 3, Biomedical Applications
12.15-1.15pm Session 4.P1, Relationship Between Industry and Research
1.15-3.45pm Lunch
3.45-5.15pm Session 4.P2, Relationship Between Industry and Research
5.30-7.30pm Round-table discussion.
WEDNESDAY, OCTOBER 6, 1999
9.00am-4.00pm Tour of local labs and companies (with paid registration)
----------------------------------------------------------------------
INVITED SPEAKERS
Daniele Amati, International School for Advanced Studies, Trieste, Italy
Emilio Bizzi, Massachusetts Institute of Technology, Cambridge, USA
Nunzio Bonavita, ABB Elsag Bailey Hartmann & Braun S.p.A., Genova, Italy
Paolo Dario, Scuola Superiore Sant'Anna, Pisa, Italy
Danilo De Rossi, Universita` degli studi di Pisa, Italy
Arnaldo D'Amico, Universita` degli Studi di Roma "Tor Vergata", Italy
Marco Dorigo, Universite' Libre de Bruxelles, Belgium
Paolo Gaudiano, Artificial Life, Inc., Boston, USA
Helen Greiner, IS Robotics Corporate, Somerville, USA
Stephen Grossberg, Boston University, USA
Barbara Hayes-Roth, Stanford University and Extempo, Inc., Redwood City, USA
Pietro Morasso, Universita` degli studi di Genova, Italy
Riccardo Parenti, Ansaldo Ricerche Automation Group, Genova, Italy
Renato Nobili, Universita` degli Studi di Padova, Italy
Stefano Nolfi, Consiglio Nazionale delle Ricerche, Roma, Italy
Antonio Pedotti, Politecnico di Milano, Italy
Tomaso Poggio, Massachusetts Institute of Technology, Cambridge, USA
Eberhard Schoeneburg, Artificial Life, Inc., Boston, USA
David Walt, Tufts University, Medford, USA
John Wyatt, Massachusetts Institute of Technology, Cambridge, USA
----------------------------------------------------------------------
UI-CANCS'99 REGISTRATION FORM
Fill out this form and e-mail it to info at usa-italy.org, or fax it to:
UI-CANCS'99 Registration (+1) 508 624 6097
Seating will be limited, so please register as soon as possible.
.....................................................................
First Name:
Last Name:
E-mail address:
Affiliation:
Telephone:
Fax:
Complete Mailing Address:

Registration type (see notes 1-3 below):
__Talks only (regular) US$100
__Talks only (student) US$30
__Full (regular) US$200
__Full (student) US$75

Credit card information:
Card type (check one): __VISA __MC __AMEX
Card number:
Name on card:
Expiration date:

Optional information (see notes 4-5 below):
* I plan to attend the optional lab tour on October 6 (Full registrants only): __Yes __No
* I would like to be considered for partial support through a travel grant: __Yes __No
.....................................................................
NOTES:
1. "Talks only" includes attendance at all invited sessions, printed program, badge, and refreshments at all coffee breaks.
2. Full registration includes attendance at all invited sessions, printed program, badge, refreshments at all coffee breaks, the pre-conference evening reception on October 3, breakfast and lunch on October 4 and 5, the banquet and plenary talk on October 4, and the optional tour of Boston-area labs on October 6.
3. Students (up to PhD level only) must attach a letter from their advisor certifying their student status.
4. Lab tours take place on October 6. Space is limited, so attendance cannot be guaranteed. Please check www.usa-italy.org for additional information as it becomes available.
5. Travel grants for partial support of travel expenses might be available to a limited number of participants. Priority is given to students and to participants coming from abroad. If you check this option, you will receive e-mail with additional information.
6. For additional information please e-mail info at usa-italy.org or call 800-701-4741 (+1-508-624-5545 from outside the US).
From ericwan at ece.ogi.edu Mon Jul 26 16:48:40 1999 From: ericwan at ece.ogi.edu (Eric Wan) Date: Mon, 26 Jul 1999 13:48:40 -0700 Subject: postdoctoral positions available Message-ID: <379CC9A8.356BA77A@ece.ogi.edu> ***************** Post-Doctoral Research Associate **************** OREGON GRADUATE INSTITUTE The Oregon Graduate Institute of Science and Technology (OGI) has an immediate opening for a motivated Ph.D. to participate in an interdisciplinary autonomous controls project. Initial appointment is for a one-year period, with the possibility of extension depending on performance and availability of funding. This is a joint project with the Electrical and Computer Engineering Dept, the Center for Coastal and Land-Margin Research (CCALMR, http://www.ccalmr.ogi.edu) and the Pacific Software Research Center (PacSoft, http://www.cse.ogi.edu/PacSoft/). SOFTWARE ENABLED CONTROL PROJECT OVERVIEW: With the increase in on-board computational resources, optimal nonlinear control can be derived from accurate real-time computational models of a vehicle and its physical environment. This multidisciplinary project couples advances in dynamic vehicle modeling, large-scale environmental forecasting, software engineering, and recent advances in nonlinear control. The objectives are to demonstrate rapid dynamic reconfiguration of control laws and the ability to perform complex maneuvers. Progress to date has focused on optimal neural control approaches for trajectory generation of surface vessels navigating in forecasted flow fields of the Columbia River Estuary. Future directions of the program will expand to aerospace environments, and may investigate a number of different control paradigms. CANDIDATE QUALIFICATIONS: Candidates should have a Ph.D. with strong expertise in control theory, neural networks and reinforcement learning. ADDITIONAL INFORMATION: The successful candidate will work with Eric A.
Wan (http://www.ece.ogi.edu/~ericwan/) and his research group and collaborators. Please send inquiries and background information to ericwan at ece.ogi.edu. Eric A. Wan Associate Professor, OGI http://ece.ogi.edu/~ericwan/ ********************************************************************* From mdorigo at ulb.ac.be Mon Jul 26 17:03:42 1999 From: mdorigo at ulb.ac.be (Marco DORIGO) Date: Mon, 26 Jul 1999 23:03:42 +0200 Subject: Final Call for Participation and Abstracts: Fourth European Workshop on Reinforcement Learning Message-ID: Call for Abstracts: EWRL-4, Fourth European Workshop on Reinforcement Learning Lugano, Switzerland, October 29-30, 1999 (We apologize for duplicates of this email) Reinforcement learning (RL) is a growing research area. To build a European RL community and give visibility to the current situation on the old continent, we are running a now biennial series of workshops. EWRL-1 took place in Brussels, Belgium (1994), EWRL-2 in Milano, Italy (1995), EWRL-3 in Rennes, France (1997). EWRL-4 will take place in Lugano, Switzerland (1999). The first morning will feature a plenary talk by Dario Floreano. The rest of the two-day workshop will be dedicated to presentations given by selected participants. Presentation length will be determined once we have some feedback on the number of participants. The number of participants will be limited. Access will be restricted to active RL researchers and their students. Please communicate as soon as possible, and in any case before the end of July 1999, your intention to participate by means of the intention form attached below (e-mail preferred: ewrl at iridia.ulb.ac.be).
Otherwise send intention forms to:
Marco Dorigo
IRIDIA, CP 194/6
Universite' Libre de Bruxelles
Avenue Franklin Roosevelt 50
1050 Bruxelles
Belgium

TIMELINE: intention forms and one page abstracts should be emailed by AUGUST 15, 1999 to ewrl at iridia.ulb.ac.be

Up-to-date information, including registration fees, hotel information, etc., is maintained at: http://iridia.ulb.ac.be/~ewrl/EWRL4/EWRL4.html

The Organizing Committee
Marco Dorigo and Hugues Bersini, IRIDIA, ULB, Brussels, Belgium,
Luca M. Gambardella and Juergen Schmidhuber, IDSIA, Lugano, Switzerland,
Marco Wiering, University of Amsterdam, The Netherlands.
--------------------------------------------------------------------
INTENTION FORM (to be emailed by AUGUST 15, 1999, to ewrl at iridia.ulb.ac.be)
Fourth European Workshop on Reinforcement Learning (EWRL-4)
Lugano, Switzerland, October 29-30, 1999
Family Name:
First Name:
Institution:
Address:
Phone No.:
Fax No.:
E-mail:
____ I intend to participate without giving a presentation
____ I intend to participate and would like to give a presentation with the following title:
____ MAX one page abstract:

From axon at cortex.rutgers.edu Tue Jul 27 12:42:11 1999 From: axon at cortex.rutgers.edu (Ralph Siegel) Date: Tue, 27 Jul 1999 12:42:11 -0400 Subject: Vision postdoctoral position - Rutgers Message-ID: <001101bed84e$f91b09a0$21b5e6a5@stp.rutgers.edu> Postdoctoral Position in Vision Research. A position for a postdoctoral trainee interested in studying visual perception is available in the Center for Molecular and Behavioral Neuroscience at Rutgers University. The collaboration concerns exploring the computations needed for spatial perception in the parietal lobe of behaving primates using single unit recordings, intrinsic optical imaging techniques, microdialysis and psychophysical methods. References to recent publications on this work are available on our Web site http://www.cmbn.rutgers.edu/cmbn/faculty/rsiegel.html.
Interested candidates should contact Ralph Siegel, CMBN, Rutgers University, 197 University Avenue, Newark, NJ 07102, Phone: 973-353-1080 x3261. Fax: 973-353-1272. E-mail: axon at cortex.rutgers.edu From ericwan at ece.ogi.edu Tue Jul 27 14:30:15 1999 From: ericwan at ece.ogi.edu (Eric Wan) Date: Tue, 27 Jul 1999 11:30:15 -0700 Subject: OGI - Ph.D Opening Message-ID: <379DFAB7.433C7A0E@ece.ogi.edu> *********** PH.D. STUDENT RESEARCH POSITION OPENING **************** CENTER FOR SPOKEN LANGUAGE UNDERSTANDING http://cslu.cse.ogi.edu/ OREGON GRADUATE INSTITUTE The Oregon Graduate Institute of Science and Technology (OGI) has an immediate opening for an outstanding student in its Electrical and Computer Engineering Ph.D. program. Full stipend and tuition will be covered. The student will specifically work with Professor Eric A. Wan (http://www.ece.ogi.edu/~ericwan/) on a number of projects relating to neural network learning and speech enhancement. QUALIFICATIONS: The candidate should have a strong background in signal processing with some prior knowledge of neural networks. A Master's degree in Electrical Engineering is preferred. Please send inquiries and background information to ericwan at ece.ogi.edu. Eric A. Wan Associate Professor, OGI ********************************************************************* OGI OGI is a young, but rapidly growing, private research institute located in the Portland area. OGI offers Masters and PhD programs in Computer Science and Engineering, Applied Physics, Electrical Engineering, Biology, Chemistry, Materials Science and Engineering, and Environmental Science and Engineering. OGI has world-renowned research programs in the areas of speech systems (Center for Spoken Language Understanding) and machine learning (Center for Information Technologies).
Center for Spoken Language Understanding The Center for Spoken Language Understanding is a multidisciplinary academic organization that focuses on basic research in spoken language systems technologies, training of new investigators, and development of tools and resources for free distribution to the research and education community. Areas of specific interest include speech recognition, natural language understanding, text-to-speech synthesis, speech enhancement in noisy conditions, and modeling of human dialogue. A key activity is the ongoing development of the CSLU Toolkit, a comprehensive software platform for learning about, researching, and developing spoken dialog systems and new applications. Center for Information Technologies The Center for Information Technologies supports development of powerful, robust, and reliable information processing techniques by incorporating human strategies and constraints. Such techniques are critical building blocks of multimodal communication systems, decision support systems, and human-machine interfaces. The CIT approach is based on emulating relevant human information processing capabilities and extending them to a variety of complex tasks. The approach requires expertise in nonlinear and adaptive signal processing (e.g., neural networks), statistical computation, decision analysis, and modeling of human information processing. Correspondingly, CIT research areas include perceptual characterization of speech and images, prediction, robust signal processing, rapid adaptation to changing environments, nonlinear signal representation, integration of information from several sources, and integration of prior knowledge with adaptation. 
From ormoneit at stat.Stanford.EDU Tue Jul 27 20:33:00 1999 From: ormoneit at stat.Stanford.EDU (Dirk Ormoneit) Date: Tue, 27 Jul 1999 17:33:00 -0700 (PDT) Subject: New Paper on Neural Networks, Filtering, and Derivatives Pricing Message-ID: <199907280033.RAA17620@rgmiller.Stanford.EDU> Dear Colleagues, Please take note of the following working paper, available online at http://www-stat.stanford.edu/~ormoneit/clearn.ps . Best regards, Dirk ------------------------------------------------------------------ A REGULARIZATION APPROACH TO CONTINUOUS LEARNING WITH AN APPLICATION TO FINANCIAL DERIVATIVES PRICING by Dirk Ormoneit We consider the training of neural networks in cases where the nonlinear relationship of interest gradually changes over time. One way to deal with this problem is regularization, where a variation penalty is added to the usual mean squared error criterion. To learn the regularized network weights we suggest the Iterative Extended Kalman Filter (IEKF) as a learning rule, which may be derived from a Bayesian perspective on the regularization problem. A primary application of our algorithm is in financial derivatives pricing, where neural networks may be used to model the dependency of the derivatives' price on one or several underlying assets. After giving a brief introduction to the problem of derivatives pricing we present experiments with German stock index options data showing that a regularized neural network trained with the IEKF outperforms several benchmark models and alternative learning procedures. In particular, the performance may be greatly improved using a newly designed neural network architecture that accounts for no-arbitrage pricing restrictions.
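[Editor's illustration] The variation-penalty idea in the abstract above can be sketched in a few lines. At each time step, instead of fitting the newest observation alone, one minimizes the squared error plus a penalty on the change in weights since the previous step, so the model tracks a drifting relationship without jumping wildly between updates. The sketch below uses a linear model and an exact per-step solve rather than the paper's Iterated Extended Kalman Filter; the function name, the drift process, and the penalty strength lam are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def variation_penalty_step(w_prev, x, y, lam):
    """One online update: minimize (y - w.x)^2 + lam * ||w - w_prev||^2.

    Setting the gradient to zero gives the linear system
    (x x^T + lam I) w = y x + lam w_prev, which is solved exactly.
    Larger lam keeps w closer to w_prev (stronger variation penalty).
    """
    d = x.shape[0]
    A = np.outer(x, x) + lam * np.eye(d)
    b = y * x + lam * w_prev
    return np.linalg.solve(A, b)

# Drifting target: the true weight vector rotates slowly over time,
# so a model fit once and frozen would gradually go stale.
d, lam = 2, 5.0
w = np.zeros(d)
errs = []
for t in range(500):
    angle = 0.01 * t                        # gradual drift of the true relationship
    w_true = np.array([np.cos(angle), np.sin(angle)])
    x = rng.normal(size=d)
    y = w_true @ x + 0.1 * rng.normal()     # noisy observation
    errs.append((w_true @ x - w @ x) ** 2)  # prediction error before the update
    w = variation_penalty_step(w, x, y, lam)

print(np.mean(errs[:50]), np.mean(errs[-50:]))
```

Expanding the update shows it is a normalized gradient step of size |x|^2 / (lam + |x|^2) toward fitting the newest point, which is why the tracked weights follow the drift while staying smooth.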
-------------------------------------------- Dirk Ormoneit Department of Statistics, Room 206 Stanford University Stanford, CA 94305-4065 ph.: (650) 725-6148 fax: (650) 725-8977 ormoneit at stat.stanford.edu http://www-stat.stanford.edu/~ormoneit/ From Jonathan.Baxter at anu.edu.au Tue Jul 27 22:03:58 1999 From: Jonathan.Baxter at anu.edu.au (Jonathan Baxter) Date: Wed, 28 Jul 1999 12:03:58 +1000 Subject: Paper Available On Reinforcement Learning Message-ID: <9907281209350V.27898@pasiphae.anu.edu.au> The following paper can be obtained from : http://wwwsyseng.anu.edu.au/~jon/papers/drlalg.ps.gz Title: Direct Gradient-Based Reinforcement Learning: I. Gradient Estimation Algorithms Authors: Jonathan Baxter and Peter Bartlett Research School of Information Sciences and Engineering Australian National University Jonathan.Baxter at anu.edu.au, Peter.Bartlett at anu.edu.au Abstract: Despite their many empirical successes, approximate value-function based approaches to reinforcement learning suffer from a paucity of theoretical guarantees on the performance of the policy generated by the value-function. In this paper we pursue an alternative approach: first compute the gradient of the {\em average reward} with respect to the parameters controlling the state transitions in a Markov chain (be they parameters of a class of approximate value functions generating a policy by some form of look-ahead, or parameters directly parameterizing a set of policies), and then use gradient ascent to generate a new set of parameters with increased average reward. We call this method ``direct'' reinforcement learning because we are not attempting to first find an accurate value-function from which to generate a policy, we are instead adjusting the parameters to directly improve the average reward. We present an algorithm for computing approximations to the gradient of the average reward from a single sample path of the underlying Markov chain. 
We show that the accuracy of these approximations depends on the relationship between the discount factor used by the algorithm and the mixing time of the Markov chain, and that the error can be made arbitrarily small by setting the discount factor suitably close to $1$. We extend this algorithm to the case of partially observable Markov decision processes controlled by stochastic policies. We prove that both algorithms converge with probability 1. From ray at hip.atr.co.jp Wed Jul 28 16:39:39 1999 From: ray at hip.atr.co.jp (Thomas S.Ray) Date: Wed, 28 Jul 99 16:39:39 JST Subject: ARTIFICIAL LIFE AND ROBOTICS (AROB 5th '00) Message-ID: <9907280739.AA02152@laplace.hip.atr.co.jp> We apologize if you receive this more than once ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ THE FIFTH INTERNATIONAL SYMPOSIUM ON ARTIFICIAL LIFE AND ROBOTICS (AROB 5th '00) For Human Welfare and Artificial Life Robotics Jan.26-Jan.28, 2000, Compal Hall, Oita, JAPAN OBJECTIVE The objective is to initiate collaborative research on the development of theories of artificial life and complexity and their applications to robotics, by forming research groups whose members are experts in their fields. Research on artificial life and complexity has recently begun in Japan and is expected to be applied in various fields. In particular, the research groups will focus their attention on the development of the hardware of artificial brains using neurocomputers etc. The research groups will also pursue fundamental research on applications of artificial life and complexity to robotics through close relationships between researchers.
TOPICS
Artificial Brain Research
Artificial Intelligence
Artificial Life
Artificial Living
Artificial Mind Research
Brain Science
Chaos
Cognitive Science
Complexity
Computer Graphics
Evolutionary Computations
Fuzzy Control
Genetic Algorithms
Innovative Computations
Intelligent Control and Modeling
Micromachines
Micro-Robot World Cup Soccer Tournament
Mobile Vehicles
Molecular Biology
Neural Networks
Neurocomputers
Neurocomputing Technologies and its Application for Hardware
Robotics
Robust Virtual Engineering
Virtual Reality

DEADLINES
September 15, 1999: Submission of a 400-600 word abstract of a paper in English (3 copies).
October 15, 1999: Notification of acceptance of papers.
November 15, 1999: Submission of the text of the paper (camera-ready manuscript).

CONFERENCE LANGUAGE English
HOMEPAGE http://arob.cc.oita-u.ac.jp/
PUBLICATION Accepted papers will be published in the Proceedings of AROB, and some high-quality papers in the Proceedings will be invited to be re-submitted for consideration for publication in a new international journal, ARTIFICIAL LIFE AND ROBOTICS (Springer), and in APPLIED MATHEMATICS AND COMPUTATION (North-Holland).
Send to:
AROB Secretariat
c/o Sugisaka Laboratory
Dept of Electrical and Electronic Engineering
Oita University
700 Dannoharu, Oita 879-1192, JAPAN
TEL +81-97-554-7841, FAX +81-97-554-7818
E-MAIL arobsecr at cc.oita-u.ac.jp
HISTORY This symposium was founded in 1996 with the support of the Science and International Affairs Bureau, Ministry of Education, Science, Sports, and Culture, Japanese Government. The symposium was held in 1996 (First), 1997 (Second), 1998 (Third), and 1999 (Fourth) at B-Con Plaza, Beppu, Oita, Japan. The Fifth symposium will be held January 26-28, 2000 at Compal Hall, Oita, Japan. This symposium invites you all to discuss the development of new technologies concerning Artificial Life and Robotics, based on simulation and hardware, in the twenty-first century.
ORGANIZED BY
Organizing Committee of International Symposium on Artificial Life and Robotics under the Sponsorship of Science and International Affairs Bureau, Ministry of Education, Science, Sports, and Culture, Japanese Government
CO-SPONSORED BY
Santa Fe Institute (SFI, USA)
The Society of Instrument and Control Engineers (SICE, JAPAN)
The Robotics Society of Japan (RSJ, JAPAN)
The Institute of Electrical Engineers of Japan (IEEJ, JAPAN)
COOPERATED BY ISICE IEICE IEEE Tokyo Section JARA
SUPPORTED BY OITA Prefecture Government, etc.
HONORARY PRESIDENT M.Hiramatsu (Governor, Oita Prefecture, Japan)
ADVISORY COMMITTEE CHAIRMAN M.Ito (Director, RIKEN, Japan)
GENERAL CHAIRMAN M.Sugisaka (Oita University, Japan)
VICE CHAIRMAN J.L.Casti (Santa Fe Institute, USA)
PROGRAM CHAIRMAN H.Tanaka (Tokyo Medical & Dental University, Japan)
ADVISORY COMMITTEE M.Ito (Director, RIKEN, Japan) (Chairman) H.Kimura (The University of Tokyo, Japan) S.Fujimura (The University of Tokyo, Japan) S.Ueno (Kyoto University, Japan)
INTERNATIONAL ORGANIZING COMMITTEE W.B.Arthur (Santa Fe Institute, USA) W.Banzhaf (University of Dortmund, Germany) C.Barrett (Los Alamos National Laboratory, USA) J.P.Crutchfield (Santa Fe Institute, USA) J.L.Casti (Santa Fe Institute, USA) (Vice Chairman) J.M.Epstein (Santa Fe Institute, USA) T.Fukuda (Nagoya University, Japan) D.J.G.James (Coventry University, UK) S.Kauffman (Santa Fe Institute, USA) C.G.Langton (Santa Fe Institute, USA) R.G.Palmer (Santa Fe Institute, USA) S.Rasmussen (Santa Fe Institute, USA) T.S.Ray (Santa Fe Institute, USA) P.Schuster (Santa Fe Institute, USA) M.Sugisaka (Oita University, Japan) (Chairman) H.Tanaka (Tokyo Medical & Dental University, Japan) C.Taylor (University of California-Los Angeles, USA) W.R.Wells (University of Nevada-Las Vegas, USA) Y.G.Zhang (Academia Sinica, China)
INTERNATIONAL STEERING COMMITTEE Z.Bubnicki (Wroclaw University of Technology, Poland) J.L.Casti (Santa Fe Institute, USA) (Co-Chairman) S.Fujimura (The
University of Tokyo) D.J.G.James (Coventry University, UK) J.J.Lee (KAIST, Korea) G.Matsumoto (RIKEN, Japan) M.Nakamura (Saga University, Japan) S.Rasmussen (Santa Fe Institute, USA) T.S.Ray (Santa Fe Institute, USA) M.Sugisaka (Oita University, Japan) (Chairman) H.Tanaka (Tokyo Medical & Dental University, Japan) C.Taylor (University of California-Los Angeles, USA) W.R.Wells (University of Nevada-Las Vegas, USA) Y.G.Zhang (Academia Sinica, China)
INTERNATIONAL PROGRAM COMMITTEE K.Abe (Tohoku University, Japan) K.Aihara (The University of Tokyo, Japan) (Co-Chairman) Z.Bubnicki (Wroclaw University of Technology, Poland) T.Christaller (GMD - German National Research Center for Information Technology, Germany) T.Fujii (RIKEN, Japan) M.Gen (Ashikaga Institute of Technology, Japan) T.Gomi (AAI, Canada) I.Harvey (University of Sussex, UK) H.Hashimoto (The University of Tokyo, Japan) (Co-Chairman) P.Husbands (University of Sussex, UK) J.Johnson (The Open University, UK) Y.Kakazu (Hokkaido University, Japan) R.E.Kalaba (University of Southern California, USA) H.Kashiwagi (Kumamoto University, Japan) O.Katai (Kyoto University, Japan) S.Kawata (Tokyo Metropolitan University, Japan) J.H.Kim (KAIST, Korea) S.Kitamura (Kobe University, Japan) H.Kitano (Sony Computer Science Laboratory Inc., Japan) T.Kitazoe (Miyazaki University, Japan) S.Kumagai (Osaka University, Japan) H.H.Lund (University of Aarhus, Denmark) M.Nakamura (Saga University, Japan) R.Nakatsu (ATR, Japan) H.H.Natsuyama (Advanced Industrial Materials, USA) T.Omori (Tokyo University of Agriculture & Technology, Japan) R.Pfeifer (University of Zurich-Irchel, Switzerland) T.S.Ray (Santa Fe Institute, USA) (Co-Chairman) T.Sawaragi (Kyoto University, Japan) T.Shibata (MITI, MEL, Japan) K.Shimohara (ATR, Japan) L.Steels (VUB AI Laboratory, Belgium) M.Sugisaka (Oita University, Japan) S.Tamura (Osaka University, Japan) H.Tanaka (Tokyo Medical & Dental University, Japan) (Chairman) N.Tosa (ATR, Japan) K.Ueda (Kobe
University, Japan) A.P.Wang (Arizona State University, USA) K.Watanabe (Saga University, Japan) X.Yao (The University of New South Wales, Australia)
LOCAL ARRANGEMENT COMMITTEE T.Hano (Oita University, Japan) T.Ito (Oita University, Japan) E.Kusayanagi (Oita University, Japan) T.Matsuo (Oita University, Japan) Y.Morita (Oita University, Japan) K.Nakano (University of Electro-Communications, Japan) K.Okazaki (Fukui University, Japan) S.Sato (Director, Research and Development Center, Oita University, Japan) K.Shigemitsu (Oita Industrial Research Institute, Japan) M.Sugisaka (Oita University, Japan) H.Tsukune (Director, Oita Industrial Research Institute, Japan) K.Yoshida (Oita University, Japan) X.Wang (Oita Institute of Technology, Japan) Y.Suzuki (Tokyo Medical & Dental University)

From biehl at physik.uni-wuerzburg.de Wed Jul 28 06:31:36 1999 From: biehl at physik.uni-wuerzburg.de (Michael Biehl) Date: Wed, 28 Jul 1999 12:31:36 +0200 (METDST) Subject: preprint: noisy regression and classification Message-ID: <199907281031.MAA00326@wptx38.physik.uni-wuerzburg.de>

FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/1999/WUE-ITP-99-015.ps.gz

The following manuscript is now available via anonymous ftp, see below for the retrieval procedure. More conveniently, it can be obtained from the Wuerzburg Theoretical Physics preprint server in the WWW: http://theorie.physik.uni-wuerzburg.de/~publications.shtml or from M. Biehl's personal homepage http://theorie.physik.uni-wuerzburg.de/~biehl
-------------------------------------------------------------
Ref. WUE-ITP-99-015
Noisy regression and classification with continuous multilayer networks
M. Ahr, M. Biehl, and R. Urbanczik

ABSTRACT
We investigate zero temperature Gibbs learning for two classes of unrealizable rules which play an important role in practical applications of multilayer neural networks with differentiable activation functions: classification problems and noisy regression problems.
Considering one step of replica symmetry breaking, we surprisingly find that for sufficiently large training sets the stable state is replica symmetric even though the target rule is unrealizable. The common practice of approximating a classification scheme by a continuous regression problem is demonstrated to drastically increase the number of examples needed for successful training.
-------------------------------------------------------------------
Retrieval procedure via anonymous ftp:
unix> ftp ftp.physik.uni-wuerzburg.de
Name: anonymous
Password: {your e-mail address}
ftp> cd pub/preprint/1999
ftp> binary
ftp> get WUE-ITP-99-015.ps.gz (*)
ftp> quit
unix> gunzip WUE-ITP-99-015.ps.gz
e.g. unix> lp -odouble WUE-ITP-99-015.ps
(*) can be replaced by "get WUE-ITP-99-015.ps". The file will then be uncompressed before transmission (slow!).
___________________________________________________________________
--
Michael Biehl
Institut fuer Theoretische Physik
Julius-Maximilians-Universitaet Wuerzburg
Am Hubland
D-97074 Wuerzburg
email: biehl at physik.uni-wuerzburg.de
www: http://theorie.physik.uni-wuerzburg.de/~biehl
Tel.: (+49) (0)931 888 5865 or 5131
Fax : (+49) (0)931 888 5141

From wolfskil at MIT.EDU Thu Jul 29 14:53:46 1999 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Thu, 29 Jul 1999 14:53:46 -0400 Subject: book announcement Message-ID: A non-text attachment was scrubbed...
From M.Orr at anc.ed.ac.uk Thu Jul 29 09:31:10 1999
From: M.Orr at anc.ed.ac.uk (Mark Orr)
Date: Thu, 29 Jul 1999 14:31:10 +0100 (BST)
Subject: Matlab RBF package
Message-ID: <199907291331.OAA21043@bimbo.cns.ed.ac.uk>

Dear Connectionists,

An updated version of my Matlab software package for RBF networks is available at http://www.anc.ed.ac.uk/~mjo/rbf.html

The methods based on ridge regression and forward selection have been improved with the addition of a mechanism for adapting the basis function widths, and there are two new methods based on Kubat's idea of using regression trees to set the centres and widths.

Mark Orr
mark at anc.ed.ac.uk

From RK at hirn.uni-duesseldorf.de Thu Jul 1 16:29:46 1999
From: RK at hirn.uni-duesseldorf.de (Rolf Kotter)
Date: Thu, 01 Jul 1999 22:29:46 +0200
Subject: Special Issue "Computational Neuroscience"
Message-ID: <377BCFBA.F37F2C91@hirn.uni-duesseldorf.de>

[apologies if you receive this announcement via several routes]

REVIEWS IN THE NEUROSCIENCES
SPECIAL ISSUE: Computational Neuroscience
Guest Editor: Rolf Kötter

Contents:

Trends in European Computational Neuroscience
Rolf Kötter

Contrast Adaptation and Infomax in Visual Cortical Neurons
Péter Adorján, Christian Piepenbrock, Klaus Obermayer

Single Cell and Population Activities in Cortical-like Systems
Fülöp Bazsó, Ádám Kepecs, Máté Lengyel, Szabolcs Payrits, Krisztina Szalisznyó, László Zalányi, Péter Érdi

Computational Models of Predictive and Memory-related Functions of the Hippocampus
Roman Borisyuk, Michael Denham, Susan Denham, Frank Hoppensteadt

Using Realistic Models to Study Synaptic Integration in Cerebellar Purkinje Cells
Erik De Schutter

Towards an Integration of Biochemical and Biophysical Models of Neuronal Information Processing: A Case Study in the Nigro-striatal System
Rolf Kötter, Dirk Schirok

Interaction of Cortex and Hippocampus in a Model of Amnesia and Semantic Dementia
Jaap M.J. Murre

Properties of the Evoked Spatio-temporal Electrical Activity in Neuronal Assemblies
Giulietta Pinato, Pietro Parodi, Alessandro Bisso, Domenico Macrì, Akio Kawana, Yasuhiko Jimbo, Vincent Torre

What Can Robots Tell Us About Brains? A Synthetic Approach Towards the Study of Learning and Problem Solving
Thomas Voegtlin, Paul F.M.J. Verschure

Publication date: October, 1999.
Further information and download of editorial: http://www.hirn.uni-duesseldorf.de/~rk/rins_toc.htm

From foster at Basit.COM Thu Jul 1 16:49:12 1999
From: foster at Basit.COM (Foster John Provost)
Date: Thu, 1 Jul 1999 16:49:12 -0400 (EDT)
Subject: KDD-99 Contest Announcement
Message-ID: <199907012049.QAA00693@knowledge.basit.com>

1999 KDD-CUP Contests

This year there will be two contests in association with the 1999 ACM SIGKDD conference in San Diego. One contest involves open-ended knowledge discovery, using clustering algorithms, rule discovery, and other methods for acquiring high-level knowledge from commercial data. The other contest involves building a classifier for detecting computer network intrusions from a very large database of network traffic records. The deadline for submitting entries is July 25. Awards will be presented at KDD-99 in August.

For further details see the KDD-99 home page: http://research.microsoft.com/datamine/kdd99/ which will take you to the 1999 KDD-CUP web page: http://www.deas.harvard.edu/courses/cs281r/cup99.html

Important note: the second URL may change, and updated contest information may be posted at any time. Those interested should monitor the KDD-99 home page.

From ml_conn at infrm.kiev.ua Fri Jul 2 05:49:31 1999
From: ml_conn at infrm.kiev.ua (Dmitri Rachkovskij)
Date: Fri, 2 Jul 1999 11:49:31 +0200 (UKR)
Subject: Connectionist symbol processing: any progress?
References:
Message-ID:

Keywords: compositional distributed representations, sparse coding, binary coding, binding, representation of structure, recursive representation, nested representation, connectionist symbol processing, long-term memory, associative-projective neural networks, analogical reasoning.

Dear Colleagues,

The following paper draft (abstract enclosed) is available at: http://cogprints.soton.ac.uk/abs/comp/199907001

Dmitri A. Rachkovskij "Representation and Processing of Structures with Binary Sparse Distributed Codes".
You may also be interested in a related paper (abstract enclosed): http://cogprints.soton.ac.uk/abs/comp/199904008

Dmitri A. Rachkovskij & Ernst M. Kussul "Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning".

Comments are welcome! Thank you and best regards, Dmitri Rachkovskij

Encl:

Representation and Processing of Structures with Binary Sparse Distributed Codes

Abstract: The schemes for compositional distributed representations include those allowing on-the-fly construction of fixed-dimensionality codevectors to encode structures of various complexity. The similarity of such codevectors takes into account both the structural and the semantic similarity of the represented structures. In this paper we provide a comparative description of the sparse binary distributed representation developed within the framework of the Associative-Projective Neural Network architecture and of the better-known Holographic Reduced Representations of Plate and Binary Spatter Codes of Kanerva. The key procedure in Associative-Projective Neural Networks is Context-Dependent Thinning, which binds codevectors and maintains their sparseness. The codevectors are stored in a structured memory array which can be realized as a distributed auto-associative memory. Examples of distributed representation of structured data are given. Fast estimation of the similarity of analogical episodes by the overlap of their codevectors is used to model analogical reasoning, both for the retrieval of analogs from memory and for analogical mapping.

--------

Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning

Abstract: Distributed representations have often been criticized as inappropriate for encoding data with a complex structure.
However, Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality. In this paper we consider the Context-Dependent Thinning procedures, which were developed for the representation of complex hierarchical items in the architecture of Associative-Projective Neural Networks. These procedures bind items represented by sparse binary codevectors (with a low probability of 1s). Such an encoding is biologically plausible and makes it possible to reach a high information capacity in the distributed associative memory where the codevectors may be stored. In contrast to known binding procedures, Context-Dependent Thinning maintains the same low density (or sparseness) of the bound codevector for a varying number of constituent codevectors. Besides, a bound codevector is not only similar to other bound codevectors with similar constituents (as in other schemes), but is also similar to the constituent codevectors themselves. This makes it possible to estimate structural similarity simply by the overlap of codevectors, without retrieving the constituent codevectors, and it also allows an easy retrieval of the constituent codevectors. Examples of algorithmic and neural-network implementations of the thinning procedures are considered. We also present representation examples for various types of nested structured data (propositions using role-filler and predicate-arguments representations, trees, directed acyclic graphs) using sparse codevectors of fixed dimension. Such representations may provide a fruitful alternative to the symbolic representations of traditional AI, as well as to the localist and microfeature-based connectionist representations.

*************************************************************************
Dmitri A. Rachkovskij, Ph.D.
Net: dar at infrm.kiev.ua Senior Researcher, V.M.Glushkov Cybernetics Center, Tel: 380 (44) 266-4119 Pr. Acad. Glushkova 40, Kiev 22, 252022, UKRAINE Fax: 380 (44) 266-1570 ************************************************************************* From espaa at exeter.ac.uk Fri Jul 2 04:51:45 1999 From: espaa at exeter.ac.uk (ESPAA) Date: Fri, 2 Jul 1999 09:51:45 +0100 (GMT Daylight Time) Subject: PAA Content of Papers In-Reply-To: Message-ID: PATTERN ANALYSIS AND APPLICATIONS JOURNAL SPRINGER-VERLAG SPECIAL ISSUE ON NEURAL NETWORKS IN IMAGE PROCESSING JUNE 1999, vol. 2, issue 2 http://www.dcs.exeter.ac.uk/paa Full Details with Abstracts at: http://www.dcs.ex.ac.uk/paa/vol2.htm ____________________________________ "Image Feature Extraction and Denoising by Sparse Coding" Erkki Oja, Helsinki University of Technology, Laboratory of Computer and Information Science, Finland Aapo Hyvarinen, Helsinki University of Technology, Laboratory of Computer and Information Science, Finland Patrick Hoyer, Helsinki University of Technology, Laboratory of Computer and Information Science, Finland pp. 104-110 "The Applicability of Neural Networks to Non-linear Image Processing" Dick de Ridder, Applied Physics Department, Delft University of Technology, Delft, Netherlands Robert P.W. Duin, Applied Physics Department, Delft University of Technology, Delft, Netherlands Piet W Verbeek, Applied Physics Department, Delft University of Technology, Delft, Netherlands Lucas J van Vliet, Applied Physics Department, Delft University of Technology, Delft, Netherlands pp. 111-128 "Outdoor Scene Classification by a Neural Tree-based Approach" G L Foresti, Department of Mathematics and Computer Science (DIMI), University of Udine, Udine, Italy pp. 
129-142 "A Neural Network Approach to Planar-Object Recognition in 3D Space" H C Sim, Image, Speech and Intelligent Systems Research Group, Department of Electronics and Computer Science, University of Southampton, UK R I Draper, Image, Speech and Intelligent Systems Research Group, Department of Electronics and Computer Science, University of Southampton, UK pp. 143-163 "Color Image Indexing Using SOM for Region-of-Interest Retrieval" Tao Chen, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore Li-Hui Chen, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore Kai-Kuang Ma, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore pp. 164-171 "Detection of Bone Tumours in Radiographic Images using Neural Networks" M Egmont-Petersen, Division of Image Processing, Dept. of Radiology, Leiden University Medical Centre, The Netherlands E Pelikan, Scientific Technical Dept., Philips Medical Systems, Hamburg, Germany pp. 172-183 "Application of a Steerable Wavelet Transform using Neural Network for Signature Verification" Emad A. Fadhel, Department of Computer Science and Engineering, IIT-Bombay, India P Bhattacharya, Department of Computer Science and Engineering, IIT-Bombay, India pp. 
184-195 "Reducing the Dimensions of Texture Features for Image Retrieval Using Multi-layer Neural Networks" Jose Antonio Catalan, School of Computer Science and Engineering, University of New South Wales, Sydney, Australia Jesse Jin, School of Computer Science and Engineering, University of New South Wales, Sydney, Australia Tom Gedeon, School of Computer Science and Engineering, University of New South Wales, Sydney, Australia 196-203 __________________________________ Oliver Jenkin Editorial Secretary Pattern Analysis and Applications Department of Computer Science University of Exeter Exeter EX4 4PT tel: +44-1392-264066 fax: +44-1392-264067 email: espaa at exeter.ac.uk ____________________________ From NEuroNet at kcl.ac.uk Fri Jul 2 05:47:20 1999 From: NEuroNet at kcl.ac.uk (NEuroNet) Date: Fri, 2 Jul 1999 10:47:20 +0100 (BST) Subject: NEuroNet WWW Site Message-ID: <199907020947.KAA29178@mail.kcl.ac.uk> This is to announce that NEuroNet, the European "Network of Excellence" in Neural Networks (funded by the 4th Framework Programme of the European Union) has a new WWW site at: http://www.kcl.ac.uk/neuronet/ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Ms Terhi V Manuel-Garner, MA Administrator to NEuroNet NEuroNet, European Network of Excellence in Neural Networks Department of Electronic Engineering King's College London, Strand, London WC2R 2LS, UK Tel.: +44 (0) 171 848 2388 Fax: +44 (0) 171 848 2559 http://www.kcl.ac.uk/neuronet Email: NEuroNet at kcl.ac.uk - PLEASE NOTE NEW TELEPHONE AND FAX NOS - * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * From i.rezek at ic.ac.uk Fri Jul 2 12:31:30 1999 From: i.rezek at ic.ac.uk (Iead Rezek) Date: Fri, 02 Jul 1999 17:31:30 +0100 Subject: PH.D. STUDENT RESEARCH POSITION AT U.B.C. Message-ID: <377CE962.CE67477D@ic.ac.uk> *********** PH.D. STUDENT RESEARCH POSITION OPENING **************** Electrical & Computer Engineering Dept. 
University of British Columbia (UBC)
http://www.ece.ubc.ca

The Pattern Recognition research group in the Electrical & Computer Engineering Dept. of the University of British Columbia (UBC) seeks an appropriately qualified individual to work on the development of Software Engineering methodologies & tools for Pattern Recognition research and development. The applicant will have to apply for enrollment in a Ph.D. program, under the supervision of Prof. Rabab Ward FRSC, and will be expected to interact and cooperate with the other members of the research group. A full description of the project is attached to this email.

Potential applicants should contact Dr. Nawwaf Kharma for an interview. The last date for submission of applications is July 31st, 1999 (for the session starting January 1st, 2000). Late applicants will be considered for the next academic year. The successful applicant will have to provide for one full year of tuition & maintenance costs; however, he/she can expect to work as a Teaching Assistant during the following years, for a total salary of about C$17K (enough to cover tuition plus most maintenance costs).

We would be very grateful if you would post or share this information with other people interested in a Ph.D. in Pattern Recognition & Software Engineering at UBC.

Nawwaf N. Kharma
E&CE Dept. UBC
2356 Main Mall, Vancouver, B.C.
Canada V6T 1Z4
Tel: +1-604-822 1742
Fax: +1-604-822-5949
Email: nawwaf at ieee.org

================================================================

Ph.D. Project Description

Purpose

The aim of this project is to investigate the theoretical and pragmatic aspects of using guided random optimization techniques (e.g. Genetic Algorithms) for the automatic construction of pattern recognition software systems.
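As a purely illustrative aside, the kind of guided random search the project proposes can be sketched in a few lines. The feature indices and the fitness function below are invented for the example; in a real system, fitness would be the cross-validated recognition accuracy of the system built from a candidate feature subset.

```python
import random

random.seed(0)

N_FEATURES = 12
# Hypothetical set of "useful" features, standing in for a real
# recognition-accuracy measurement.
USEFUL = {1, 3, 4, 8}

def fitness(mask):
    # Reward covering the useful features; penalize extras (parsimony).
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & USEFUL) - 0.1 * len(chosen - USEFUL)

def crossover(a, b):
    # One-point crossover of two bitmasks.
    cut = random.randrange(1, N_FEATURES)
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return [bit ^ (random.random() < rate) for bit in mask]

population = [[random.randint(0, 1) for _ in range(N_FEATURES)]
              for _ in range(30)]
for generation in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(20)]
    population = parents + children                # elitist: parents survive

best = max(population, key=fitness)
print(sorted(i for i, bit in enumerate(best) if bit))
```

Because the parents survive unchanged each generation, the best fitness found never decreases; the same loop structure applies whether the genome encodes feature subsets, pre-processing choices, or classifier parameters.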
The final goal of this project is the development of a software system capable (with little human intervention) of constructing another software system, one which is (near-)optimally suited to a specific recognition task. An example of a recognition task would be recognizing the set of words making up a hand-written Arabic sentence.

Background & Motivation

A character recognition system may be broken up, functionally, into four components. One component carries out 'pre-processing' functions, such as normalization and thinning. Another component accepts the pre-processed input pattern and extracts those features that best characterize the pattern. The extracted features are used by a 'classification' component (such as a Neural Net) to assign a label to the pattern. All functions carried out after (initial) classification fall under 'post-processing'.

The majority of the effort expended in the field of character recognition has been directed towards the discovery of new algorithms: ones that pre-process the input pattern, that extract the most relevant features, or that are most able to classify the patterns with increased certainty and efficiency. However, almost all of this effort has been expended manually, and there is no reason why much of this work cannot, in the future, be done by machines. This is exactly the reason for our interest in any and all techniques that are potentially capable of emulating some of Man's unique ability to search and optimize, and to do so efficiently.

Objectives

The specific objectives of this project are:

- To achieve a clear and comprehensive understanding of the processes of feature selection and classification, both in character recognition systems and in humans.

- To survey the wide range of genetic and other evolutionary computation algorithms (& software) currently used in the realm of Pattern Recognition, and specifically Character Recognition.
- To acquire a good degree of working knowledge of recent methodologies of software engineering, such as object-oriented methodologies (including UML), as well as a reasonable level of software testing expertise.

- To propose a hypothesis relating to the theoretical and pragmatic aspects of the automatic development of character/pattern recognition software systems.

- To methodically specify, design, and code a software development platform capable of authoring/customizing software for specific character recognition purposes (in line with the hypothesis above).

- To validate the software development platform via extensive testing, including using it to actually construct a simple (but complete) example of character recognition software.

- To document all the above via various progress reports, scientific papers, a software user's manual, and (of course) a thesis.

Skills Required

Very good programming skills (in Delphi/C++), plus an interest in Software Engineering. Some basic background in Character Recognition and/or the ability to autonomously learn new concepts and techniques in Structural/Statistical Pattern Recognition. Good writing, presentation, and communication skills (in English). Patience!

From ormoneit at stat.Stanford.EDU Fri Jul 2 20:24:18 1999
From: ormoneit at stat.Stanford.EDU (Dirk Ormoneit)
Date: Fri, 2 Jul 1999 17:24:18 -0700 (PDT)
Subject: Report on Local Linear Regression
Message-ID: <199907030024.RAA15468@rgmiller.Stanford.EDU>

The following technical report is now available on-line at http://www-stat.stanford.edu/~ormoneit/tr-1999-11.ps

Best, Dirk

------------------------------------------------------------------

OPTIMAL KERNEL SHAPES FOR LOCAL LINEAR REGRESSION
by Dirk Ormoneit and Trevor Hastie

Local linear regression performs very well in many low-dimensional forecasting problems. In high-dimensional spaces, its performance typically decays due to the well-known ``curse of dimensionality''.
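For readers unfamiliar with the method, plain local linear regression with a spherical Gaussian weighting kernel fits in a dozen lines. The toy 1-D data and bandwidth below are invented for illustration; the report's contribution, learning the kernel shape itself, is not attempted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = sin(x) + noise (illustrative only).
x = rng.uniform(0.0, 6.0, size=200)
y = np.sin(x) + 0.1 * rng.normal(size=200)

def local_linear(x0, x, y, h=0.5):
    """Local linear estimate at x0 with a Gaussian weighting kernel of width h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local design matrix
    # Weighted least squares: solve (X' W X) beta = X' W y.
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]                                   # intercept = fit at x0

print(local_linear(3.0, x, y))  # local estimate of E[y | x = 3.0]
```

In one dimension the kernel of width h covers plenty of samples; in d dimensions the same h covers exponentially fewer, which is exactly the volume argument the abstract makes next.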
Specifically, the volume of a weighting kernel that contains a fixed number of samples increases exponentially with the number of dimensions. The bias of a local linear estimate may thus become unacceptable for many real-world data sets. A possible way to control the bias is by varying the ``shape'' of the weighting kernel. In this work we suggest a new, data-driven method for estimating the optimal kernel shape. Experiments using two artificially generated data sets and data from the UC Irvine repository show the benefits of kernel shaping.

--------------------------------------------
Dirk Ormoneit
Department of Statistics, Room 206
Stanford University
Stanford, CA 94305-4065
ph.: (650) 725-6148
fax: (650) 725-8977
ormoneit at stat.stanford.edu
http://www-stat.stanford.edu/~ormoneit/

From rreed at wport.com Sun Jul 4 00:01:02 1999
From: rreed at wport.com (Russ Reed)
Date: Sat, 3 Jul 1999 21:01:02 -0700
Subject: new book
Message-ID:

This may interest readers of this list.

NEW BOOK: Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks
Russell D. Reed & Robert J. Marks II (MIT Press, 1999).

Contents: Artificial neural networks are nonlinear mapping systems whose structure is loosely based on principles observed in the nervous systems of humans and animals. The basic idea is that massive systems of simple units linked together in appropriate ways can generate many complex and interesting behaviors. This book focuses on the subset of feedforward artificial neural networks called multilayer perceptrons (MLPs). These are the most widely used neural networks, with applications as diverse as finance (forecasting), manufacturing (process control), and science (speech and image recognition). This book presents an extensive and practical overview of almost every aspect of MLP methodology, progressing from an initial discussion of what MLPs are and how they might be used to an in-depth examination of technical factors affecting performance.
The book can be used as a tool kit by readers interested in applying networks to specific problems, yet it also presents theory and references outlining the last ten years of MLP research.

Table of Contents

Preface
1 Introduction 1
2 Supervised Learning 7
3 Single-Layer Networks 15
4 MLP Representational Capabilities 31
5 Back-Propagation 49
6 Learning Rate and Momentum 71
7 Weight-Initialization Techniques 97
8 The Error Surface 113
9 Faster Variations of Back-Propagation 135
10 Classical Optimization Techniques 155
11 Genetic Algorithms and Neural Networks 185
12 Constructive Methods 197
13 Pruning Algorithms 219
14 Factors Influencing Generalization 239
15 Generalization Prediction and Assessment 257
16 Heuristics for Improving Generalization 265
17 Effects of Training with Noisy Inputs 277
A Linear Regression 293
B Principal Components Analysis 299
C Jitter Calculations 311
D Sigmoid-like Nonlinear Functions 315
References 319
Index 339

Ordering information:

1. MIT Press http://mitpress.mit.edu/book-home.tcl?isbn=0262181908
2. amazon.com http://www.amazon.com/exec/obidos/ASIN/0262181908/qid%3D909520837/sr%3D1-21/002-3321940-3881246
3. Barnes & Noble http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=1KKG10OPZT&mscssid=A7M4XXV5DNS12MEG00CGNDBFPT573NJS&pcount=&isbn=0262181908
4. buy.com http://www.buy.com/books/product.asp?sku=30360116

From sbay at algonquin.ics.uci.edu Thu Jul 1 02:06:13 1999
From: sbay at algonquin.ics.uci.edu (Stephen D. Bay)
Date: Wed, 30 Jun 1999 23:06:13 -0700
Subject: The UCI KDD Archive
Message-ID: <9906302306.aa02757@paris.ics.uci.edu>

**************************************************
The UCI KDD Archive
Call for Datasets
http://kdd.ics.uci.edu/
**************************************************

The UC Irvine Knowledge Discovery in Databases (KDD) Archive is a new online repository (http://kdd.ics.uci.edu/) of large datasets which encompasses a wide variety of data types, analysis tasks, and application areas.
The primary role of this repository is to serve as a benchmark testbed to enable researchers in knowledge discovery and data mining to scale existing and future data analysis algorithms to very large and complex data sets. This archive is supported by the Information and Data Management Program at the National Science Foundation, and is intended to expand the current UCI Machine Learning Database Repository (http://www.ics.uci.edu/~mlearn/MLRepository.html) to datasets that are orders of magnitude larger and more complex. We are seeking submissions of large, well-documented datasets that can be made publicly available. Data types and tasks of interest include, but are not limited to:

Data Types        Tasks
multivariate      classification
time series       regression
sequential        clustering
relational        density estimation
text/web          retrieval
image             causal modeling
spatial           visualization
multimedia        discovery
transactional     exploratory data analysis
heterogeneous     data cleaning
sound/audio       recommendation systems

Submission Guidelines: Please see the UCI KDD Archive web site for detailed instructions.

Stephen Bay (sbay at ics.uci.edu)
librarian

From ralfh at cs.tu-berlin.de Tue Jul 6 05:19:57 1999
From: ralfh at cs.tu-berlin.de (Ralf Herbrich)
Date: Tue, 6 Jul 1999 11:19:57 +0200 (MET DST)
Subject: TR announcement
Message-ID:

We would like to announce the availability of the following technical report:

Bayesian Learning in Reproducing Kernel Hilbert Spaces
Ralf Herbrich, Thore Graepel, Colin Campbell

Abstract

Support Vector Machines find the hypothesis that corresponds to the centre of the largest hypersphere that can be placed inside version space, i.e. the space of all consistent hypotheses given a training set. The boundaries of version space touched by this hypersphere define the support vectors. An even more promising approach is to construct the hypothesis using the whole of version space.
This is achieved by the Bayes point: the midpoint of the region of intersection of all hyperplanes bisecting version space into two volumes of equal magnitude. It is known that the centre of mass of version space approximates the Bayes point. The centre of mass is estimated by averaging over the trajectory of a billiard in version space. We derive bounds on the generalisation error of Bayesian classifiers in terms of the volume ratio of version space and parameter space. This ratio serves as an effective VC dimension and greatly influences generalisation. We present experimental results indicating that Bayes Point Machines consistently outperform Support Vector Machines. Moreover, we show theoretically and experimentally how Bayes Point Machines can easily be extended to admit training errors.

The gzipped PostScript file can be downloaded at http://stat.cs.tu-berlin.de/~ralfh/bayes.ps.gz

Best regards,
Ralf Herbrich, Thore Graepel, and Colin Campbell

-------------------------------------------------------------------
Ralf Herbrich        phone : +49-30-314-25817
TU Berlin            email : ralfh at cs.tu-berlin.de
FB 13; FR 6-9        URL   : http://stat.cs.tu-berlin.de/~ralfh
10587 Berlin
Germany
[teaching assistant in the statistics group at the TU Berlin]
------------------------------------------------------------------

From oby at cs.tu-berlin.de Tue Jul 6 11:04:39 1999
From: oby at cs.tu-berlin.de (Klaus Obermayer)
Date: Tue, 6 Jul 1999 17:04:39 +0200 (MET DST)
Subject: preprints available
Message-ID: <199907061504.RAA03798@pollux.cs.tu-berlin.de>

Dear connectionists,

attached please find abstracts and preprint locations of four manuscripts on ANN theory and one short manuscript on visual cortex modelling:

1. self-organizing maps for similarity data & active learning (book chapter)
2. support vector learning for ordinal data (conference paper)
3. classification on proximity data with LP-machines (conference paper)
4. neural networks in economics (review)
5.
contrast response and orientation tuning in a mean field model of visual cortex (conference paper)

Comments are welcome!

Cheers, Klaus

-----------------------------------------------------------------------------
Prof. Dr. Klaus Obermayer        phone: 49-30-314-73442
FR2-1, NI, Informatik                   49-30-314-73120
Technische Universitaet Berlin   fax:   49-30-314-73121
Franklinstrasse 28/29            e-mail: oby at cs.tu-berlin.de
10587 Berlin, Germany            http://ni.cs.tu-berlin.de/
=============================================================================

Active Learning in Self-Organizing Maps
M. Hasenjäger^1, H. Ritter^1 and K. Obermayer^2
^1 Technische Fakultät, Universität Bielefeld, ^2 Fachbereich Informatik, Technische Universität Berlin

The self-organizing map (SOM) was originally proposed by T. Kohonen in 1982 on biological grounds and has since become a widespread tool for exploratory data analysis. Although introduced as a heuristic, SOMs have been related to statistical methods in recent years, which has led to a theoretical foundation in terms of cost functions as well as to extensions to the analysis of pairwise data, in particular of dissimilarity data. In our contribution, we first relate SOMs to probabilistic autoencoders, re-derive the SOM version for dissimilarity data, and review part of the above-mentioned work. Then we turn our attention to the fact that dissimilarity-based algorithms scale with O($D^2$), where $D$ denotes the number of data items, and may therefore become impractical for real-world datasets. We find that the majority of the elements of a dissimilarity matrix are redundant and that a sparse matrix with more than 80% missing values suffices to learn a SOM representation of low cost. We then describe a strategy for selecting the most informative dissimilarities for a given set of objects. We suggest selecting (and measuring) only those elements whose knowledge maximizes the expected reduction in the SOM cost function.
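For reference, the standard vector-space SOM that the dissimilarity-data version generalizes can be written in a few lines. The toy data, map size, and annealing schedule below are invented for illustration and are not the settings used in the chapter.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.uniform(0.0, 1.0, size=(500, 2))   # toy data in the unit square

n_units = 10
# 1-D map: unit j sits at grid coordinate j; weight vectors start random.
W = rng.uniform(0.0, 1.0, size=(n_units, 2))
grid = np.arange(n_units)

n_steps = 2000
for t in range(n_steps):
    x = data[rng.integers(len(data))]
    winner = np.argmin(np.sum((W - x) ** 2, axis=1))   # best-matching unit
    frac = t / n_steps
    lr = 0.5 * (1 - frac) + 0.01                        # decaying learning rate
    sigma = 3.0 * (1 - frac) + 0.5                      # shrinking neighborhood
    h = np.exp(-0.5 * ((grid - winner) / sigma) ** 2)   # neighborhood function
    W += lr * h[:, None] * (x - W)                      # pull winner and neighbors

print(np.round(W, 2))  # trained codebook vectors
```

Since every update moves a weight vector toward a data point by a factor below one, the codebook stays inside the convex hull of the data; the dissimilarity-data variant replaces the Euclidean winner search and update with operations on the pairwise dissimilarity matrix.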
We find that active data selection is computationally expensive, but may reduce the number of necessary dissimilarities by more than a factor of two compared to a random selection strategy. This makes active data selection a viable alternative when the cost of actually measuring dissimilarities between data objects is high.

in: Kohonen Maps (Eds. E. Oja and S. Kaski), Elsevier, pp. 57-70 (1999).
available at: http://ni.cs.tu-berlin.de/publications/#conference

-----------------------------------------------------------------------------

Support vector learning for ordinal regression
R. Herbrich, T. Graepel, and K. Obermayer
Fachbereich Informatik, Technische Universität Berlin

We investigate the problem of predicting variables of ordinal scale. This task is referred to as ordinal regression and is complementary to the standard machine learning tasks of classification and metric regression. In contrast to statistical models, we present a distribution-independent formulation of the problem together with uniform bounds on the risk functional. The approach presented is based on a mapping from objects to scalar utility values. Similar to Support Vector methods, we derive a new learning algorithm for the task of ordinal regression based on large margin rank boundaries. We give experimental results for an information retrieval task: learning the order of documents with respect to an initial query. The experimental results indicate that the presented algorithm outperforms more naive approaches to ordinal regression, such as Support Vector classification and Support Vector regression, in the case of more than two ranks.

in: International Conference on Artificial Neural Networks 1999 (accepted for publication)
available at: http://ni.cs.tu-berlin.de/publications/#conference

-----------------------------------------------------------------------------

Classification on proximity data with LP-machines
T. Graepel^1, R. Herbrich^1, B. Schölkopf^2, A. Smola^2, P. Bartlett^3, K.
Müller^2, K. Obermayer^1, and R. Williamson^3
^1 Fachbereich Informatik, Technische Universität Berlin
^2 GMD FIRST
^3 Australian National University

We provide a new linear program for the classification of data given in terms of pairwise proximities. This allows us to avoid the problems inherent in using feature spaces with indefinite metric in Support Vector Machines, since the notion of a margin is needed only in input space, where the classification actually occurs. Moreover, in our approach we can enforce sparsity in the proximity representation by sacrificing training error. This turns out to be favorable for proximity data. Similar to $\nu$--SV methods, the only parameter needed in the algorithm is the (asymptotic) number of data points being classified with a margin. Finally, the algorithm is successfully compared with $\nu$--SV learning in proximity space and $K$--nearest-neighbors on real-world data from neuroscience and molecular biology.

in: International Conference on Artificial Neural Networks 1999 (accepted for publication)
available at: http://ni.cs.tu-berlin.de/publications/#conference

-----------------------------------------------------------------------------

Neural Networks in Economics: Background, Applications and New Developments

R. Herbrich, M. Keilbach, T. Graepel, and K. Obermayer
Fachbereich Informatik, Technische Universitaet Berlin

Neural Networks were developed in the sixties as devices for classification and regression. The approach was originally inspired by Neuroscience. Its attractiveness lies in the ability to learn, i.e. to generalize to as yet unseen observations. One aim of this paper is to give an introduction to the technique of Neural Networks and an overview of the most popular architectures. We start from statistical learning theory to introduce the basics of learning. Then, we give an overview of the general principles of neural networks and of their use in the field of Economics.
A second purpose is to introduce a recently developed Neural Network learning technique, the so-called Support Vector Network Learning, which is an application of ideas from statistical learning theory. This approach has shown very promising results on problems with a limited number of training examples. Moreover, utilizing a technique known as the kernel trick, Support Vector Networks can easily be adapted to nonlinear models. Finally, we present an economic application of this approach from the field of preference learning.

in: Computational Techniques for Modelling Learning in Economics (Ed. T. Brenner), Kluwer Academic, in press (1999)
available at: http://ni.cs.tu-berlin.de/publications/#books

-----------------------------------------------------------------------------

On the Influence of Threshold Variability in a Model of the Visual Cortex

H. Bartsch, M. Stetter, and K. Obermayer
Fachbereich Informatik, Technische Universität Berlin

Orientation--selective neurons in monkeys and cats show contrast saturation and contrast--invariant orientation tuning. Recently proposed models for orientation selectivity predict contrast-invariant orientation tuning but no contrast saturation at high strengths of recurrent intracortical coupling, whereas at lower coupling strengths the contrast response saturates but the tuning widths are contrast dependent. In the present work we address the question of whether, and under which conditions, the incorporation of a stochastic distribution of activation thresholds of cortical neurons leads to saturation of the contrast response curve as a network effect. We find that contrast saturation occurs naturally if two different classes of inhibitory interneurons are combined. Low-threshold inhibition keeps the gain of the cortical amplification finite, whereas high-threshold inhibition causes contrast saturation.
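[A purely illustrative aside, not the authors' model: a tiny steady-state rate model shows how combining a low-threshold and a high-threshold inhibitory population can produce the saturation effect just described. All weights and thresholds below are invented.]

```python
import numpy as np

def relu(x, thresh):
    """Threshold-linear activation."""
    return np.maximum(x - thresh, 0.0)

# Invented steady-state rate model: one excitatory population with recurrent
# amplification, plus a low-threshold and a high-threshold inhibitory pool.
w_ee, w_low, w_high = 2.0, 1.5, 4.0   # recurrent excitation, inhibition weights
th_low, th_high = 0.0, 1.0            # activation thresholds of the two pools

def steady_state(contrast, iters=800):
    """Damped fixed-point iteration for the excitatory rate."""
    e = 0.0
    for _ in range(iters):
        drive = (contrast + w_ee * e
                 - w_low * relu(e, th_low)     # low-threshold: always active, keeps the gain finite
                 - w_high * relu(e, th_high))  # high-threshold: engages only at high rates
        e = 0.9 * e + 0.1 * relu(drive, 0.0)
    return e

for c in (0.1, 0.5, 1.0, 2.0, 4.0):
    print(f"contrast {c:3.1f} -> response {steady_state(c):.2f}")
```

[With these particular numbers the response grows linearly with contrast while the high-threshold pool is silent; once it engages, the incremental gain drops by roughly a factor of nine, i.e. the contrast response saturates.]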
in: International Conference on Artificial Neural Networks 1999 (accepted for publication)
available at: http://ni.cs.tu-berlin.de/publications/#conference

From fmdist at hotmail.com Tue Jul 6 13:08:37 1999
From: fmdist at hotmail.com (Fionn Murtagh)
Date: Tue, 06 Jul 1999 10:08:37 PDT
Subject: Post-Doc Research Fellowship available, 3 yrs
Message-ID: <19990706170837.65673.qmail@hotmail.com>

Post-Doc Research Fellowship available, 3 yrs

Research areas of particular interest - non-exhaustive:
- image display and transfer, eyegaze monitoring
- image and video compression, videoconferencing
- wavelet transforms, multiscale methods, neural networks
- multivariate data analysis, signal processing
- space, medical, financial data and image analysis

Information: Prof. F. Murtagh, School of Computer Science, The Queen's University of Belfast, Belfast BT7 1NN, Northern Ireland
http://www.cs.qub.ac.uk/~F.Murtagh, email f.murtagh at qub.ac.uk

______________________________________________________
Get Your Private, Free Email at http://www.hotmail.com

From klweber at Math.TU-Cottbus.DE Tue Jul 6 05:35:58 1999
From: klweber at Math.TU-Cottbus.DE (Klaus Weber)
Date: Tue, 6 Jul 1999 11:35:58 +0200 (MET DST)
Subject: NC'2000 Announcement and CfP
Message-ID: <199907060935.LAA02849@vieta.math.tu-cottbus.de.math.tu-cottbus.de>

ANNOUNCEMENT / CALL FOR PAPERS

Second International ICSC Symposium on NEURAL COMPUTATION / NC'2000
To be held at the Technical University of Berlin, Germany
May 23-26, 2000
http://www.icsc.ab.ca/nc2000.htm

SYMPOSIUM CHAIR
Prof. Hans Heinrich Bothe
Oerebro University, Dept. of Technology Science, Fakultetsgatan 1, S - 70182 Oerebro, Sweden
Email: hans.bothe at ton.oru.se
Phone: +46-19-10-3786, Fax: +46-19-10-3463
and
Technical University of Denmark (DTU), Department of Information Technology, Building 344, DK-2800 Lyngby, Denmark
Email: hhb at it.dtu.dk
Phone: +45-4525-3632, Fax: +45-4588-0117

PUBLICATION CHAIR
Prof.
Raul Rojas
Freie Universität Berlin, Institut Informatik / FB Mathematik, Takustrasse 9, D - 14195 Berlin / Germany
Email: rojas at inf.fu-berlin.de
Fax: +49-30-8387-5109

SYMPOSIUM ORGANIZER
ICSC International Computer Science Conventions
P.O. Box 657, CH-8055 Zurich / Switzerland
Phone: +41-878-888-150, Fax: +41-1-761-9627
Email: icsc at icsc.ch
WWW: http://www.icsc.ab.ca

INTERNATIONAL PROGRAM COMMITTEE
Igor Aleksander, Imperial College of Science & Technology, London, U.K.
Peter G. Anderson, Rochester Institute of Technology, NY, USA
Horst Bischof, Technical University Vienna, Austria
Ruediger W. Brause, J.W. Goethe-University, Frankfurt, Germany
Juan Lopez Coronado, Universidad Politecnica de Cartagena, Spain
Ludwig Cromme, Brandenburgische Technische Universitaet Cottbus, Germany
Chris deSilva, University of Western Australia, Crawley, Australia
Georg Dorffner, Austrian Research Institute for Artificial Intelligence, Vienna, Austria
Kunihiko Fukushima, The University of Electro-Communications, Tokyo, Japan
Wulfram Gerstner, EPFL Lausanne, Switzerland
Stan Gielen, University of Nijmegen, Netherlands
Marco Gori, University of Siena, Italy
Bruce Graham, University of Edinburgh, U.K.
Gunhan Dundar, Bosporus University, Istanbul, Turkey
Chris J. Harris, University of Southampton, U.K.
Dorothea Heiss-Czedik, Technical University Vienna, Austria
Michael Heiss, Siemens Vienna, Austria
Lakhmi C. Jain, University of South Australia, The Levels, Australia
Nikola Kasabov, University of Otago, Dunedin, New Zealand
Bart Kosko, University of Southern California, Los Angeles CA, USA
Rudolf Kruse, University of Magdeburg, Germany
Fa-Long Luo, R&D Department, Redwood City, USA
G. Nicolas Marichal, University of La Laguna, Tenerife, Spain
Giuseppe Martinelli, University of Rome 1, Italy
Fazel Naghdy, University of Wollongong, Australia
Sankar K. Pal, Indian Statistical Institute, Calcutta, India
M.
Palaniswami, University of Melbourne, Australia
Yoh-Han Pao, Case Western Reserve University, Cleveland OH, USA
Alexander V. Pavlov, Laboratory for Optical Fuzzy Systems, Russia
Witold Pedrycz, University of Alberta, Canada
Raul Rojas, Freie Universität Berlin, Germany
V. David Sanchez A., Falon, Inc., San Diego CA, USA
Bernd Schuermann, Siemens ZFE, Munich, Germany
J.S. Shawe-Taylor, Royal Holloway University of London, U.K.
Peter Sincak, Technical University of Kosice, Slovakia
Nigel Steele, Coventry University, U.K.
Rainer Stotzka, Forschungszentrum Karlsruhe, Germany
Piotr Szczepaniak, Technical University of Lodz, Poland
Csaba Szepesvari, University of Szeged, Hungary
Henning Tolle, Technische Hochschule Darmstadt, Germany
Shiro Usui, Toyohashi University of Technology, Toyohashi, Japan
Klaus Weber, Technical University of Cottbus, Germany
Andreas Weingessel, Technical University Vienna, Austria
Takeshi Yamakawa, Kyushu Institute of Technology, Fukuoka, Japan
Andreas Zell, Universitaet Tuebingen, Germany
Jacek M. Zurada, University of Louisville, KY, USA
(list incomplete)

*************************************************
INTRODUCTION

The science of neural computation focuses on mathematical aspects of solving complex practical problems, and it also seeks to help neurology, brain theory and cognitive psychology in the understanding of the functioning of the nervous system by means of computational models of neurons, neural nets and subcellular processes. NC'2000 aims to become a major point of contact for research scientists, engineers and practitioners throughout the world in the field of Neural Computation. Participants will share the latest research, developments and ideas in the wide arena of disciplines encompassed under the heading of NC'2000, as a follow-up to the highly successful NC'98 conference in Vienna, Austria.
*************************************************
TOPICS

Contributions are sought in areas based on the list below, which is indicative only. Contributions from new application areas are welcome.

COMPUTATIONAL NEURAL NETWORK MODELS
- Artificial neural network paradigms
- Knowledge representation
- Learning and generalization
- Probabilistic neural networks
- Information theoretic approaches
- Time-coded neural networks
- Pulse-coded neural networks
- Self-organization
- Cellular automata
- Hybrid systems (e.g. neuro-fuzzy, GA, evolutionary strategies)
- Chaos in neural networks
- Statistical and numerical aspects

NEUROPHYSIOLOGICALLY INSPIRED MODELS
- Neurophysiological foundations
- Spiking neuron models and neuron assemblies
- Models of brain centers and sensory pathways
- Sensorimotor integration
- Sensation, Perception and Attention
- Spatio-temporal Orientation
- Reactive Behavior

SOFTWARE AND HARDWARE IMPLEMENTATIONS
- Simulation and Graphical Programming Tools
- Distributed Systems
- Neuro-chips, -controllers and -computers
- Analog and Digital Electronic Implementations
- Optic, Holographic Implementations

NEURAL NETWORK APPLICATIONS
- Pre-processing and Feature Extraction
- Sound, Speech and Image Processing
- Pattern Recognition and System Identification
- Computer Vision, Feature Binding and Image Understanding
- Autonomous Sensor Systems, Multivariate Sensor Fusion
- Robotics and Control
- Behavior based Exploration and Planning
- Power Systems
- Environmental Systems
- Decision Support Systems
- Medical Applications
- Operational Research and Logistics

*************************************************
SCIENTIFIC PROGRAM

NC'2000 will include invited plenary talks, contributed sessions, invited sessions, workshops and tutorials.

*************************************************
INVITED SESSIONS

The organization of invited sessions is encouraged.
Prospective organizers are requested to send a session proposal (consisting of 4-5 invited papers, the recommended session-chair and co-chair, as well as a short statement describing the title and the purpose of the session) to the Symposium Chairman or the Symposium Organizer. Invited sessions should preferably start with a tutorial paper. The registration fee of the session organizer will be waived if at least 4 authors of invited papers register for the conference.

*************************************************
POSTER PRESENTATIONS

Poster presentations are encouraged for those who wish to receive peer feedback; practical examples of applied research are particularly welcome. Poster sessions will allow the presentation and discussion of the respective papers, which will also be included in the conference proceedings.

*************************************************
WORKSHOPS, TUTORIALS AND OTHER CONTRIBUTIONS

Proposals should be submitted as soon as possible to the Symposium Chairman or the Symposium Organizer.

*************************************************
SUBMISSION OF PAPERS

Prospective authors are requested to send either a draft paper or an extended abstract for review by the International Program Committee. All papers must be written in English, starting with a succinct statement of the problem, the results achieved, their significance and a comparison with previous work. Submissions must be received by October 31, 1999. Regular papers, as well as poster presentations, tutorial papers and invited sessions are encouraged.

The abstract should also include:
- Title of conference (NC'2000)
- Type of paper (regular, poster, tutorial or invited)
- Title of proposed paper
- Authors names, affiliations, addresses
- Name of author to contact for correspondence
- E-mail address and fax # of contact author
- Topics which best describe the paper (max.
5 keywords)
- Short CV of authors (recommended)

Contributions are welcome from those working in industry and having experience in the topics of this conference, as well as from academics. The conference language is English. It is strongly recommended to submit abstracts by electronic mail to icsc at icsc.ch, or else by fax or mail (2 copies) to the following address:

ICSC Switzerland, P.O. Box 657, CH-8055 Zurich, Switzerland
Fax: +41-1-761-9627

*************************************************
BEST PRESENTATION AWARDS

The best oral and poster presentations will be honored with best presentation awards.

*************************************************
PUBLICATIONS

Conference proceedings (including all accepted papers) will be published by ICSC Academic Press and be available for the delegates at the symposium in printed form or on CD-ROM. Authors of a selected number of innovative papers will be invited to submit extended manuscripts for publication in prestigious international journals.

*************************************************
IMPORTANT DATES
- Submission Deadline: October 31, 1999
- Notification of Acceptance: December 31, 1999
- Delivery of full papers: February 15, 2000
- Tutorials and Workshops: May 23, 2000
- NC'2000 Symposium: May 24-26, 2000

*************************************************
ACCOMMODATION

Accommodation at reasonable rates will be available at nearby hotels. Full details will follow with the letters of acceptance.

*************************************************
SOCIAL AND TOURIST ACTIVITIES

A social program will be organized and will also be available for accompanying persons.

*************************************************
THE FASCINATING CITY OF BERLIN

The old and new capital of Germany is a mecca for scientists and cultural enthusiasts, for day workers and night owls. Charming with its several opera houses, concert halls, cabarets, and beer gardens, Berlin is full of spontaneous cultural events or happenings.
As an urban building site for the future, it is at the same time a living contradiction and a casual place with relaxation areas and large parks right in the city center. The fine nature, pine tree forests, and '1001' lakes around the city supply Berlin with its very specific 'sparkling air' in springtime. No other city in Germany has during the last 100 years played such a prominent role in history and in the imagination of the people: social and industrial revolution, World War I and, later, the establishment of the first German republic, the 'Golden Twenties', Nazi dictatorship and World War II, division by the Berlin Wall, the 'economic miracle' in the west and the socialist showpiece city in the east, the '68 student and alternative lifestyle movement in the west and the peace movement in the east, and finally, the fall of the Wall. Berlin tempts with its many research facilities, and it is Germany's largest industrial city, with headquarters or branches of most major companies. At present, approximately 3.5 million inhabitants live in the reunified city, among them more than 120,000 students, who study at three universities and twelve colleges or schools of arts. Berlin is worth the trip. Welcome!

*************************************************
FURTHER INFORMATION

Fully updated information is available from http://www.icsc.ab.ca/nc2000.htm

You may also contact
ICSC International Computer Science Conventions
P.O. Box 657, CH-8055 Zurich, Switzerland
Email: icsc at icsc.ch
Phone: +41-878-888-150, Fax: +41-1-761-9627
or, for specific scientific requests, the symposium chairman.
From brunel at asterix.ccs.brandeis.edu Tue Jul 6 16:58:49 1999
From: brunel at asterix.ccs.brandeis.edu (Nicolas Brunel)
Date: Tue, 6 Jul 1999 16:58:49 -0400 (EDT)
Subject: papers available
Message-ID:

Dear connectionists,

The following two papers on the analysis of the dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons are now available on my web page: http://mumitroll.ccs.brandeis.edu/~brunel/journal.html

"Fast global oscillations in networks of integrate-and-fire neurons with low firing rates"
N Brunel and V Hakim
to appear in Neural Computation, 11, 1621-1671 (1999)

Abstract: We study analytically the dynamics of a network of sparsely connected inhibitory integrate-and-fire neurons in a regime where individual neurons emit spikes irregularly and at a low rate. In the limit when the number of neurons $N\rightarrow\infty$, the network exhibits a sharp transition between a stationary and an oscillatory global activity regime in which neurons are weakly synchronized. The activity becomes oscillatory when the inhibitory feedback is strong enough. The period of the global oscillation is found to be mainly controlled by synaptic times, but it also depends on the characteristics of the external input. In large but finite networks, the analysis shows that global oscillations of finite coherence time generically exist both above and below the critical inhibition threshold. Their characteristics are determined as functions of system parameters in these two different regimes. The results are found to be in good agreement with numerical simulations.

"Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons"
N Brunel
to appear in Journal of Computational Neuroscience

Abstract: The dynamics of networks of sparsely connected excitatory and inhibitory integrate-and-fire neurons is studied analytically.
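[As an aside, here is a minimal, self-contained simulation of a sparsely connected inhibitory integrate-and-fire network of the general kind studied in these papers. It is an illustrative sketch only: all parameter values are invented and are not those analysed by the authors.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch of a sparsely connected inhibitory integrate-and-fire network.
N, C = 1000, 100                       # neurons; inhibitory connections per neuron
tau, theta, v_reset = 20.0, 1.0, 0.0   # membrane time constant (ms), threshold, reset
J, delay = 0.1, 2.0                    # synaptic strength and transmission delay (ms)
dt, T = 0.1, 200.0                     # Euler time step and duration (ms)
mu_ext, sigma_ext = 1.2, 0.3           # mean and noise of the external drive

# Fixed sparse connectivity: each neuron inhibits C random targets.
targets = [rng.choice(N, size=C, replace=False) for _ in range(N)]

v = rng.uniform(0.0, theta, size=N)
steps, d_steps = int(T / dt), int(delay / dt)
spike_buffer = [np.array([], dtype=int)] * d_steps  # circular delay line
rate = []

for t in range(steps):
    # Leaky integration toward the noisy external drive (Euler-Maruyama step).
    v += dt / tau * (mu_ext - v) + sigma_ext * np.sqrt(dt / tau) * rng.normal(size=N)
    # Deliver the inhibitory spikes emitted `delay` ms ago.
    for s in spike_buffer[t % d_steps]:
        v[targets[s]] -= J
    spiked = np.flatnonzero(v >= theta)
    v[spiked] = v_reset
    spike_buffer[t % d_steps] = spiked          # store this step's spikes
    rate.append(len(spiked) / N / (dt / 1000.0))  # population rate in Hz

print(f"mean population rate: {np.mean(rate):.1f} Hz")
```

[Varying the drive, the coupling strength J, or the delay moves such a simulation between the stationary and oscillatory regimes discussed in the abstracts; plotting `rate` over time makes the distinction visible.]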
The analysis reveals a very rich repertoire of states, including: synchronous states in which neurons fire regularly; asynchronous states with stationary global activity and very irregular individual cell activity; and states in which the global activity oscillates but individual cells fire irregularly, typically at rates lower than the global oscillation frequency. The network can switch between these states when the external frequency, or the balance between excitation and inhibition, is varied. Two types of network oscillations are observed: in the 'fast' oscillatory state, the network frequency is almost fully controlled by the synaptic time scale; in the 'slow' oscillatory state, the network frequency depends mostly on the membrane time constant. Finite-size effects in the asynchronous state are also discussed.

Nicolas Brunel
Volen Center for Complex Systems, MS 013, Brandeis University
415 South Street, Waltham, MA 02254-9110, USA
Tel (781) 736 2890 -- Fax (781) 736 4877
Email brunel at asterix.ccs.brandeis.edu
http://mumitroll.ccs.brandeis.edu/~brunel

From rinehart at scf-fs.usc.edu Tue Jul 6 16:57:13 1999
From: rinehart at scf-fs.usc.edu (John Rinehart)
Date: Tue, 06 Jul 1999 13:57:13 -0700
Subject: conference: Replacement Parts for the Brain: Intracranial Implantation of Hardware Models of Neural Circuitry
Message-ID: <3.0.3.32.19990706135713.007a83c0@scf.usc.edu>

Replacement Parts for the Brain: Intracranial Implantation of Hardware Models of Neural Circuitry
An NIMH, USC-AMI Sponsored Conference
August 12-14 1999, Willard Hotel, Washington D.C.

This conference will bring together leading researchers throughout the country for a focus on one of the newest frontiers of neuroscientific and bioengineering research: the intracranial implantation of computer chip models of brain function as neural prosthetics to replace damaged or dysfunctional brain tissue.
In considering the development of "replacement parts for the brain", speakers will address recent advances in (i) biologically realistic mathematical models of brain or spinal cord function, (ii) silicon- and/or photonics-based computational devices which incorporate those models, and (iii) "neuron-silicon interface" devices, i.e., micron-scale multi-site electrode arrays to provide bi-directional communication between the computational element and functioning neuronal tissue. The conference is intended to synergize interdisciplinary research in the neural, engineering, and biomedical sciences.

For more information regarding the conference and meeting registration, please visit the conference web site: http://www.usc.edu/dept/biomed/NeuralProstheticsCNF

Conference Organizers:

Theodore W. Berger, Ph.D.
Director, Center for Neural Engineering
500 Olin Hall, Department of Biomedical Engineering
University of Southern California, Los Angeles, CA 90089-1450
telephone: 213-740-8017, fax: 213-740-0343
e-mail: berger at bmsrs.usc.edu

Dennis Glanzman, Ph.D.
Chief, Theoretical Neuroscience Program, NIMH
Room 11-102, 5600 Fishers Lane, Rockville, MD
telephone: 301-443-1576, fax: 301-443-4822
e-mail: glanzman at helix.nih.gov

From sbaluja at lycos.com Wed Jul 7 23:53:56 1999
From: sbaluja at lycos.com (sbaluja@lycos.com)
Date: Wed, 7 Jul 1999 23:53:56 -0400
Subject: Paper: Memory-based Face Recognition
Message-ID: <852567A8.0014E52E.00@pghmta2.mis.pgh.lycos.com>

Paper: High-Performance Memory-based Face Recognition for Visitor Identification
JPRC-Technical Report-1999-01
Authors: Terence Sim, Rahul Sukthankar, Matthew D.
Mullin & Shumeet Baluja

Available from: http://www.cs.cmu.edu/~baluja/techreps.html & http://www.cs.cmu.edu/~rahuls/pub/

Abstract: We show that a simple, memory-based technique for view-based face recognition, motivated by the real-world task of visitor identification, can outperform more sophisticated algorithms that use Principal Components Analysis (PCA) and neural networks. This technique is closely related to correlation templates; however, we show that the use of novel similarity measures greatly improves performance. We also show that augmenting the memory base with additional, synthetic face images results in further improvements in performance. Results of extensive empirical testing on two standard face recognition datasets are presented, and direct comparisons with published work show that our algorithm achieves comparable (or superior) results. This paper further demonstrates that our algorithm has desirable asymptotic computational and storage behavior, and is ideal for incremental training. Our system is incorporated into an automated visitor identification system that has been operating successfully in an outdoor environment for several months.

Contact: tsim at jprc.com, rahuls at jprc.com, mdm at jprc.com, sbaluja at lycos.com
Comments and questions are welcome!

From peterw at cogs.susx.ac.uk Thu Jul 8 06:10:34 1999
From: peterw at cogs.susx.ac.uk (Peter Williams)
Date: Thu, 8 Jul 1999 11:10:34 +0100
Subject: Senior post in Neural Computation
Message-ID:

UNIVERSITY OF SUSSEX
SCHOOL OF COGNITIVE AND COMPUTING SCIENCES
SENIOR FACULTY POST - Neural Computation/Computer Vision (Ref 133)

Applications are invited for a permanent faculty position, up to Chair level, within the Computer Science and Artificial Intelligence Subject Group of the School of Cognitive and Computing Sciences. The expected start date is 1 October 1999 or as soon as possible thereafter.
Candidates should be able to show evidence of significant research achievement in Neural Computation or Computer Vision. The successful applicant will be expected to expand significantly the existing high research profile of the Group in this area. Applicants may also be expected to contribute to teaching in more general areas of Computer Science or Artificial Intelligence. The level of the appointment, Professor/Reader/Senior Lecturer, will be appropriate to the achievements and potential of the successful candidate. Salary in the range: 30,496 - 34,464 per annum (Senior Lecturer/Reader) or 35,170 minimum per annum (Professorial). Salaries under review.

Informal enquiries may be made to Dr Peter Williams on Tel +44 1273 678756, Email peterw at cogs.susx.ac.uk. Details of the School are available at http://www.cogs.susx.ac.uk.

Closing date: Friday 13 August 1999. Application forms and further particulars are available from, and should be returned to, Liz Showler, Staffing Services Office, Sussex House, University of Sussex, Falmer, Brighton BN1 9RH, UK. Tel +44 1273 877324, Email E.S.Showler at sussex.ac.uk. Details of all posts can also be found via the website below.
http://www.susx.ac.uk/Units/staffing

An Equal Opportunity Employer

From cmbishop at microsoft.com Fri Jul 9 04:52:22 1999
From: cmbishop at microsoft.com (Christopher Bishop)
Date: Fri, 9 Jul 1999 01:52:22 -0700
Subject: New book: "Neural Networks and Machine Learning"
Message-ID: <3FF8121C9B6DD111812100805F31FC0D101F241E@RED-MSG-59>

New book: "Neural Networks and Machine Learning"
Christopher M. Bishop (Ed.)
Springer-Verlag (for information on how to order: see below)

Contents:
B. D. Ripley: "Statistical Principles of Model Fitting"
L. Breiman: "Bias-Variance, Regularization, Instability and Stabilization"
J. M. Buhmann and N. Tishby: "Empirical Risk Optimization: A Statistical Learning Theory of Data Clustering"
E. D.
Sontag: "VC Dimension of Neural Networks"
R. M. Neal: "Assessing Relevance Determination Methods using DELVE"
D. J. C. MacKay: "Introduction to Gaussian Processes"
H. Zhu, C. K. I. Williams, R. Rohwer and M. Morciniec: "Gaussian Regression and Optimal Finite Dimensional Linear Models"
T. S. Jaakkola and M. I. Jordan: "Variational Methods and the QMR-DT Database"
D. Barber and C. M. Bishop: "Ensemble Learning in Bayesian Neural Networks"
V. Vapnik: "The Support Vector Method of Function Estimation"
B. Sallans, G. E. Hinton and Z. Ghahramani: "A Hierarchical Community of Experts"
E. B. Baum: "Manifesto for an Evolutionary Economics of Intelligence"

Preface:

From harnad at coglit.ecs.soton.ac.uk Fri Jul 9 12:47:34 1999
From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad)
Date: Fri, 9 Jul 1999 17:47:34 +0100 (BST)
Subject: SPEECH RECOGNITION: BBS Call for Commentators
Message-ID:

Below is the abstract of a forthcoming BBS target article

MERGING INFORMATION IN SPEECH RECOGNITION: FEEDBACK IS NEVER NECESSARY
by Norris D., McQueen J. M., Cutler A.

*** Please see also 5 important announcements about new BBS policies and address change at the bottom of this message. ***

This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate.
To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL by July 21st to: bbs at cogsci.soton.ac.uk

or write to:
Behavioral and Brain Sciences
ECS: New Zepler Building
University of Southampton
Highfield, Southampton SO17 1BJ
UNITED KINGDOM
http://www.princeton.edu/~harnad/bbs/
http://www.cogsci.soton.ac.uk/bbs/

If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection on the Web.

_____________________________________________________________

MERGING INFORMATION IN SPEECH RECOGNITION: FEEDBACK IS NEVER NECESSARY

Dennis Norris
Medical Research Council Cognition and Brain Sciences Unit, 15 Chaucer Rd., Cambridge, CB2 2EF, U.K.
Dennis.Norris at mrc-cbu.cam.ac.uk
http://www.mrc-cbu.cam.ac.uk/

James M. McQueen and Anne Cutler
Max-Planck-Institute for Psycholinguistics, Wundtlaan 1, 6525 XD Nijmegen, The Netherlands
James.McQueen at mpi.nl and Anne.Cutler at mpi.nl
http://www.mpi.nl

ABSTRACT: Top-down feedback does not benefit speech recognition; on the contrary, it can hinder it. No experimental data imply that feedback loops are required for speech recognition. Feedback is accordingly unnecessary and spoken word recognition is modular. To defend this thesis we analyse lexical involvement in phonemic decision-making. TRACE (McClelland & Elman 1986), a model with feedback from the lexicon to prelexical processes, is unable to account for all the available data on phonemic decision-making.
The modular Race model (Cutler & Norris 1979) is, however, likewise challenged by some recent results. We therefore present a new modular model of phonemic decision-making, the Merge model. In Merge, information flows from prelexical processes to the lexicon without feedback. Because phonemic decisions are based on the merging of prelexical and lexical information, Merge correctly predicts lexical involvement in phonemic decisions in both words and nonwords. Computer simulations show how Merge is able to account for the data through a process of competition between lexical hypotheses. We discuss the issue of feedback in other areas of language processing, and conclude that modular models are particularly well suited to the problems and constraints of speech recognition.

KEYWORDS: feedback, modularity, phonemic decisions, lexical processing, computational modelling, word recognition, speech recognition, reading

____________________________________________________________

To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web from the US or UK BBS Archive. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article.

The URLs you can use to get to the BBS Archive:
http://www.princeton.edu/~harnad/bbs/
http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.norris.html

____________________________________________________________

*** FIVE IMPORTANT ANNOUNCEMENTS ***

------------------------------------------------------------------
(1) There have been some very important developments in the area of Web archiving of scientific papers very recently.
Please see:
Science: http://www.cogsci.soton.ac.uk/~harnad/science.html
Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html
American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html
Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm

---------------------------------------------------------------------
(2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere, at no cost to anyone.

---------------------------------------------------------------------
(3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive:
ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html
http://www.cogsci.soton.ac.uk/bbs/Archive/

--------------------------------------------------------------------
(4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures.
Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html --------------------------------------------------------------------- (5) Call for Book Nominations for BBS Multiple Book Review In the past, the Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). From harnad at coglit.ecs.soton.ac.uk Fri Jul 9 13:00:58 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Fri, 9 Jul 1999 18:00:58 +0100 (BST) Subject: LOCALIST CONNECTIONISM: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article CONNECTIONIST MODELLING IN PSYCHOLOGY: A LOCALIST MANIFESTO by Mike Page This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences.
Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL by July 21st to: bbs at cogsci.soton.ac.uk or write to [PLEASE NOTE SLIGHTLY CHANGED ADDRESS]: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection at BBS's Princeton or Southampton Website. _____________________________________________________________ CONNECTIONIST MODELLING IN PSYCHOLOGY: A LOCALIST MANIFESTO Mike Page Medical Research Council Cognition and Brain Sciences Unit, 15 Chaucer Rd., Cambridge, CB2 2EF, U.K. mike.page at mrc-cbu.cam.ac.uk http://www.mrc-cbu.cam.ac.uk/ ABSTRACT: Over the last decade, fully-distributed models have become dominant in connectionist psychological modelling, whereas the virtues of localist models have been underestimated. This target article illustrates some of the benefits of localist modelling. Localist models are characterized by the presence of localist representations rather than the absence of distributed representations. A generalized localist model is proposed that exhibits many of the properties of fully distributed models.
It can be applied to a number of problems that are difficult for fully distributed models, and its applicability can be extended through comparisons with a number of classic mathematical models of behaviour. There are reasons why localist models have been underused, and these are addressed. In particular, many conclusions about connectionist representation, based on neuroscientific observation, are called into question. There are still some problems inherent in the application of fully distributed systems and some inadequacies in proposed solutions to these problems. In the domain of psychological modelling, localist modelling is to be preferred. KEYWORDS: connectionist modelling, neural networks, localist, distributed, competition, choice, reaction-time, consolidation. ___________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web at the US or UK BBS Archive. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.page.html
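As a rough, hypothetical illustration of the localist idea in the abstract above (one unit per hypothesis, units competing until a choice emerges), here is a minimal sketch in Python. The rich-get-richer competition rule below is an assumption chosen for simplicity; it is not Page's generalized localist model.

```python
import numpy as np

def compete(activations, steps=50, gain=2.0):
    """Iterative competition among localist units.

    Each element of `activations` is the activation of one unit that
    stands for a single hypothesis (e.g. one word). A rich-get-richer
    nonlinearity followed by normalization lets the best-supported
    unit suppress its rivals -- a simple winner-take-all choice.
    """
    a = np.asarray(activations, dtype=float)
    for _ in range(steps):
        a = a ** gain      # amplify differences between units
        a = a / a.sum()    # competitive normalization
    return a

# Three hypotheses with bottom-up support 0.2, 0.5, 0.3:
final = compete([0.2, 0.5, 0.3])
print(final.argmax())  # the second (best-supported) unit wins
```

With these settings the winning unit's activation converges toward 1 while its rivals are driven toward 0, which is the sense in which a localist network makes a discrete choice.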
From gerstner at lamisun1.epfl.ch Mon Jul 12 11:59:08 1999 From: gerstner at lamisun1.epfl.ch (Wulfram Gerstner) Date: Mon, 12 Jul 1999 17:59:08 +0200 (MET DST) Subject: PhD-studentship Message-ID: <199907121559.RAA09199@mantrasun8.epfl.ch> 2 PhD scholarships are available at the Swiss Federal Institute of Technology in Lausanne (EPFL) for thesis research on A) Information Theoretic Approaches to Spike Coding B) Kernels and Support Vector Machines for Vision The PhD students will be part of the Center for Neuromimetic Systems, see http://diwww.epfl.ch/mantra Highly motivated candidates with a strong theory background (math, physics) and an interest in statistical learning theory, information theory, and machine learning should apply by sending their CV together with the name of a reference to wulfram.gerstner at di.epfl.ch. Subject line: application Unformatted text or postscript format only. --------------------------------------------------------------------- Wulfram Gerstner Swiss Federal Institute of Technology Lausanne Assistant Professor Centre for Neuro-mimetic Systems Computer Science Department, EPFL, IN-J, 032 1015 Lausanne EPFL Tel. +41-21-693 6713 wulfram.gerstner at di.epfl.ch Fax.
+41-21-693 5263 http://diwww.epfl.ch/mantra --------------------------------------------------------------------- From moris at ims.u-tokyo.ac.jp Mon Jul 12 23:31:16 1999 From: moris at ims.u-tokyo.ac.jp (Shinichi Morishita) Date: Tue, 13 Jul 1999 12:31:16 +0900 Subject: Call for Papers: PAKDD2000 Message-ID: <01BECD2B.9A4EDCE0@moris-dh223.ims.u-tokyo.ac.jp> ======================================================================== CALL FOR PAPERS: PAKDD-2000 ======================================================================== The Fourth Pacific-Asia Conference on Knowledge Discovery and Data Mining Kyoto, Japan April 18-20, 2000 Papers Due: October 10, 1999 Sponsored by: (to be confirmed) Japanese Society of Artificial Intelligence SIG-KBS (Knowledge Base Systems) SIG-FAI (Fundamental AI) The Institute of Electronics, Information and Communication Engineers SIG-DE (Data Engineering), SIG-AI (Artificial Intelligence) Japan Society for Software Science and Technology SIG-DM (Data Mining) Information Processing Society of Japan SIG-DB (Data Base) SIG-ICS (Intelligent & Complex Systems) ACM SIG-MOD Japan Keihanna Interaction Plaza, Inc. The Fourth Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD-2000) will provide an international forum for the sharing of original research results and practical development experiences among researchers and application developers from different KDD-related areas such as machine learning, databases, statistics, knowledge acquisition, data visualization, knowledge-based systems, soft computing, and high performance computing. It will follow the success of PAKDD-97 held in Singapore in 1997, PAKDD-98 held in Australia in 1998, and PAKDD-99 held in China in 1999 by bringing together participants from universities, industry and government. Papers on all aspects of knowledge discovery and data mining are welcome.
Areas of interest include, but are not limited to: - Theory and Foundational Issues in KDD * Data and Knowledge Representation * Logic for/of Knowledge Discovery * Expanding the Autonomy of Machine Discoverers * Human Factors * Scientific Discovery * New Theory, Philosophy, and Methodology - KDD Algorithms and Methods * Machine Learning Methods * Statistical Methods * Heuristic Search * Inductive Logic Programming * Deduction, Induction and Abduction * Discovery of Exceptions and Deviations * Multi-criteria Evaluation and Data Mining Metrics * Hybrid and Multi-agent Methods * Evaluation of Complexity, Efficiency, and Scalability of Algorithms - Process-Centric KDD * Models and Framework of the Knowledge Discovery Process * Data and Dimensionality Reduction * Preprocessing and Postprocessing * Interestingness Checking of Data and Rules * Management and Refinement for the Discovered Knowledge * Decomposition of Large Data Sets * Discretisation of Continuous Data * Data and Knowledge Visualization * Role of Domain Knowledge and Reuse of Discovered Knowledge * KDD Process and Human Interaction - Soft Computing for KDD * Information Granulation and Granular Computing * Rough Sets in Data Mining * Neural Networks, Probabilistic Reasoning * Noise Handling and Uncertainty Management * Hybrid Symbolic/Connectionist KDD Systems - High Performance Data Mining and Applications * Multi-Database Mining * Data Mining in Advanced Databases (OODB, Spatial DB, Multimedia DB) * Database Reverse Engineering * Integration of Data Warehousing, OLAP and Data Mining * Combining Data Mining with Database Querying * Parallel and Distributed Data Mining * Data Mining on the Internet * Multi-agent, Multi-task KDD Systems * Data Mining from Unstructured and Multimedia Data * Unification of Data Mining with Intelligent Information Retrieval * Security and Privacy Issues * Successful/Innovative KDD Applications in Science, Engineering, Medicine, Business, Education, Government, and Industry Both
research and applications papers are solicited. All submitted papers will be reviewed on the basis of technical quality, relevance to KDD, originality, significance, and clarity. Accepted papers are expected to be published in the conference proceedings by Springer-Verlag in the Lecture Notes in Artificial Intelligence series. A selected number of PAKDD-2000 accepted papers will be expanded and revised for inclusion in major Japanese and/or international journals. Candidates include "Knowledge and Information Systems: An International Journal" by Springer-Verlag (http://kais.mines.edu/~kais/). The PAKDD Best Paper Award will be conferred on the authors of the best paper at the conference. The winner will receive US$500 and free registration. Authors are invited to email postscript files of their papers to terano at gssm.otsuka.tsukuba.ac.jp or to send four copies of them to Prof. Takao Terano (PAKDD-2000) Graduate School of Systems Management, The University of Tsukuba, Tokyo 3-29-1 Otsuka, Bunkyo-ku, Tokyo 112, Japan Tel.: +81-3-3942-6855 Fax.: +81-3-3942-6829 Email: terano at gssm.otsuka.tsukuba.ac.jp Electronic submission is highly preferred. Papers must be received by October 10, 1999. Notification of acceptance will be emailed to the first (or designated) author by December 15, 1999. Camera-ready copy of accepted papers will be due January 15, 2000. Format: The paper should consist of a cover page with title, authors' names, postal and e-mail addresses, an approximately 200-word summary, up to 5 keywords, and a single-spaced body not longer than ten pages. It is recommended that the authors use the style file of Springer-Verlag (http://www.springer.de/comp/lncs/authors.html) to minimize possible conflicts with the paper length limit when preparing the camera-ready copy.
PAKDD Steering Committee: ========================= Xindong Wu, Colorado School of Mines, USA (Chair) Hongjun Lu, National University of Singapore (Co-Chair) Rao Kotagiri, University of Melbourne, Australia Huan Liu, National University of Singapore Hiroshi Motoda, Osaka University, Japan Lizhu Zhou, Tsinghua University, China Ning Zhong, Yamaguchi University, Japan Conference Chairs: ================== Masaru Kitsuregawa, University of Tokyo, Japan Hiroshi Motoda, Osaka University, Japan Program Chairs: =============== Takao Terano, Tsukuba University, Japan (Chair) Arbee L. P. Chen, National Tsing Hua University, Taiwan (Co-chair) Huan Liu, National University of Singapore, Singapore (Co-chair) Publicity Chair: =============== Shinichi Morishita, University of Tokyo, Japan Workshop Chair: =============== Takahira Yamaguchi, Shizuoka University, Japan Tutorial Chair: =============== Shusaku Tsumoto, Shimane Medical University, Japan Local Organizing Committee Chair: ================================= Shiro Takata, Keihanna Interaction Plaza, Inc., Japan Program Committee: (To be announced) ==================== Further Information: ==================== Takao Terano (PAKDD-2000) Graduate School of Systems Management, The University of Tsukuba, Tokyo 3-29-1 Otsuka, Bunkyo-ku, Tokyo 112, Japan Tel.: +81-3-3942-6855 Fax.: +81-3-3942-6829 Email: terano at gssm.otsuka.tsukuba.ac.jp --------------------------------------------------------------- Call for Tutorial Proposals --------------------------- PAKDD-2000 will offer a tutorial program on KDD topics. We would be able to present only a limited number of tutorials, and the selection would be guided by the perceived quality and relevance to the conference. If you are interested in giving a tutorial, please send a proposal to tsumoto at computer.org by Oct. 10, 1999. Call for Workshop Proposals --------------------------- PAKDD-2000 will provide a venue for a few workshops to focus on advanced research areas of KDD.
Please submit suggestions for workshop proposals to yamaguti at cs.inf.shizuoka.ac.jp by Oct. 10, 1999. Call for Panel Proposals ------------------------ Proposals are sought for panels that stimulate interaction between the communities contributing to KDD. Include the title, the main goals, prospective participants, and a summary of the topics to be discussed. Please email panel proposals to terano at gssm.otsuka.tsukuba.ac.jp by Oct. 10, 1999. Call for Exhibits and Industry Sessions --------------------------------------- PAKDD-2000 will organize a data mining, OLAP and data warehouse product exhibition. Please send proposals for exhibits to terano at gssm.otsuka.tsukuba.ac.jp by Oct. 10, 1999. -------------------------------------------------------------------- From tom at erato.atr.co.jp Tue Jul 13 06:51:12 1999 From: tom at erato.atr.co.jp (Tomohiro Shibata) Date: Tue, 13 Jul 1999 19:51:12 +0900 Subject: Humanoid Robot Demonstration Message-ID: <19990713195112G.tom@erato.atr.co.jp> The home page presenting information about a humanoid robot that we developed is available online at http://www.erato.atr.co.jp/DB/ The following draft is what we used for the press release on June 24, 1999. ------- Subject: Humanoid Robot Demonstration Embargoed until: June 24, 1999, 13:00 Abstract: The Kawato Dynamic Brain Project, ERATO, JST introduces the HUMANOID ROBOT, a dextrous anthropomorphic robot that has the same kinematic structure as the human body, with 30 active degrees of freedom (without fingers). We believe that employing a HUMANOID ROBOT is the first step towards a complete understanding of high-level functions of the brain by mathematical analysis. For demonstration purposes, the HUMANOID ROBOT performs the Okinawa folk dance "Kacha-shi" and learns human-like eye movements based on neurobiological theories.
It is noteworthy that the acquisition of the Okinawa folk dance was achieved through "learning from demonstration", which is in sharp contrast to the classic approach of manual robot programming. Learning from demonstration means learning by watching a demonstration of a teacher performing the task. In our approach to learning from demonstration, a reward function is learned from the demonstration, together with a task model that can be acquired from repeated attempts to perform the task. Knowledge of the reward function and the task model allows the robot to compute an appropriate control mechanism. Over the last few years, we have made significant progress in "learning from demonstration", such that we are able to apply the developed theories to the HUMANOID ROBOT. We believe that learning from demonstration will provide one of the most important footholds to understanding the information processes of sensori-motor control and learning in the brain. We believe that the following three levels are essential for a complete understanding of brain functions: (a) hardware level; (b) information representation and algorithms; and (c) computational theory. We are studying high-level functions of the brain by utilizing multiple methods such as neurophysiological analysis of the basal ganglia and cerebellum; psychophysical and behavioral analysis of visual motor learning; brain activity measured by fMRI; mathematical analysis; computer simulation of neural networks; and robotics experiments using the HUMANOID ROBOT. For instance, in one of our approaches, we are trying to learn a neural network model for motor learning with the HUMANOID ROBOT that incorporates data from psychophysical and behavioral experiments as well as brain activity data from fMRI studies. The HUMANOID ROBOT reproduces a learned model in a real task, and we are able to verify the model by checking its robustness and performance.
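As a loose, toy-scale analogy to the scheme described above (infer a reward from a demonstration, then use a task model to compute a controller), here is a hypothetical sketch in Python. The 1-D gridworld, the visitation-frequency reward, and all names are illustrative assumptions only, not the project's actual algorithms.

```python
import numpy as np

# Toy "learning from demonstration" in a 1-D gridworld of 5 states.
# The teacher walks right and stays at the goal; a reward is inferred
# from how often the teacher visits each state, and a controller is
# then computed by value iteration on the (here: known) task model.

n_states = 5
demo = [0, 1, 2, 3, 4, 4, 4]                     # demonstrated trajectory

# Reward inferred from the demonstration: state visitation frequency.
reward = np.bincount(demo, minlength=n_states).astype(float)
reward /= reward.sum()

def model(state, action):
    """Task model: action 0 moves left, action 1 moves right."""
    return max(0, state - 1) if action == 0 else min(n_states - 1, state + 1)

# Value iteration (discount 0.9) yields the control policy.
V = np.zeros(n_states)
for _ in range(200):
    V = np.array([reward[s] + 0.9 * max(V[model(s, 0)], V[model(s, 1)])
                  for s in range(n_states)])
policy = [int(V[model(s, 1)] >= V[model(s, 0)]) for s in range(n_states)]
print(policy)  # the computed controller moves right, toward the goal
```

The point of the sketch is only the division of labor: the reward comes from the demonstration, the task model from (here, simulated) experience, and the controller from combining the two.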
A lot of attention is being given to the study of brain functions using this new tool: the HUMANOID ROBOT. This should be a first important step towards changing the future of brain science. Date of press release: June 24, 1999 Organizer: JST Host Organizer: STA Style: Material along with lecture, videotape, and demonstration Place: Experiment room of the Kawato Dynamic Brain Project, Kyoto Chairman: Mitsuo Kawato, Project Director (one hour including questions.) -- SHIBATA, Tomohiro, Ph.D. | email: tom at erato.atr.co.jp Kawato Dynamic Brain Project | WWW: http://www.erato.atr.co.jp/~tom ERATO | Robotics in Japan page: JST | http://www.erato.atr.co.jp/~tom/jrobres.html From wray at EECS.Berkeley.EDU Tue Jul 13 21:36:01 1999 From: wray at EECS.Berkeley.EDU (Wray Buntine) Date: Tue, 13 Jul 1999 18:36:01 -0700 (PDT) Subject: research position on NASA-funded project, synthesis of learning alg. Message-ID: <199907140136.SAA21107@ic.EECS.Berkeley.EDU> Immediate opening for a late summer position, full or half-time, funded by NASA Ames Research Center, for a good masters, in-progress PhD, or postdoc candidate. Only American citizens or resident aliens may apply. An introduction to the research for the project can be found in the group's pending KDD'99 paper available at: http://www-cad.eecs.berkeley.edu/~wray/kdd99.ps and the relevance of broader research goals at: http://www-cad.eecs.berkeley.edu/~wray/Mirror/x2009.pdf This is a unique opportunity to work with leading experts in both data analysis and automated software engineering http://ic.arc.nasa.gov/ic/projects/amphion/index.html developing NASA's future intelligent systems. The position is short term, has potential for growth, and is ideal for a student seeking late summer work. The Task ======== The applicant should have knowledge of statistical algorithms for data analysis, and familiarity with their application and their implementation.
The applicant will support NASA Ames' automated software engineering experts in their development and testing of the system, and will co-develop the system with the data analysis expert. The System ========== Code synthesis is routinely used in industry to generate GUIs, for database support, and in computer-aided design, and has been successfully used in development projects for scheduling. Our research applies synthesis to statistical, data analysis algorithms. For synthesis, we use a specification language that generalizes Bayesian networks, a dependency model on variables. Using decomposition methods and algorithm templates, our system transforms the network through several levels of representation into pseudo-code which can be translated into the implementation language of choice. Algorithm templates have been developed for a variety of sophisticated schemes including EM, mean-field, maximum a posteriori, and iteratively-reweighted least squares. The transformation system is based on a term-rewriting core with the capacity for symbolic simplification and differentiation of sums and products over vectors, matrices and delta functions. We are currently developing a back-end optimizer for Matlab. Requirements ============ The applicant should be a keen and experienced coder with a knowledge of the relevant data analysis algorithms and their implementation. Code is written in a mixture of Prolog (for the synthesis) and Lisp (for the back-end), and basic exposure to these languages is necessary. The back-end is Matlab, so exposure here is useful too. Development is on a Linux platform. The applicant should be willing to learn new languages/environments. Please contact Wray Buntine for further information at: wray at ic.eecs.berkeley.edu Applications ============ Email your resume or a URL to Wray Buntine at the above address. Microsoft Word documents definitely discouraged. No deadline, but sooner is better!
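As a much-simplified, hypothetical illustration of what a term-rewriting core with symbolic simplification and differentiation looks like (scalars and two operators only, not the vector/matrix/delta-function machinery the project describes, and in Python rather than Prolog), consider this sketch:

```python
# Expressions are nested tuples: ('+', a, b), ('*', a, b),
# a variable name (str), or a numeric constant.

def simplify(e):
    """Rewrite rules: 0+x -> x, 0*x -> 0, 1*x -> x, constant folding."""
    if not isinstance(e, tuple):
        return e
    op, a, b = e[0], simplify(e[1]), simplify(e[2])
    if op == '+':
        if a == 0: return b
        if b == 0: return a
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            return a + b
    if op == '*':
        if a == 0 or b == 0: return 0
        if a == 1: return b
        if b == 1: return a
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            return a * b
    return (op, a, b)

def diff(e, v):
    """Differentiate expression e with respect to variable v, then simplify."""
    if isinstance(e, (int, float)):
        return 0
    if isinstance(e, str):
        return 1 if e == v else 0
    op, a, b = e
    if op == '+':                       # sum rule
        return simplify(('+', diff(a, v), diff(b, v)))
    if op == '*':                       # product rule
        return simplify(('+', ('*', diff(a, v), b), ('*', a, diff(b, v))))
    raise ValueError(f"unknown operator {op!r}")

# d/dx (x*x + 3) simplifies to x + x:
print(diff(('+', ('*', 'x', 'x'), 3), 'x'))
```

The same pattern, rules matched against term structure plus a simplifier applied after every rewrite, scales up to the indexed sums and products a statistical-algorithm synthesizer has to manipulate.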
From wahba at stat.wisc.edu Tue Jul 13 20:22:17 1999 From: wahba at stat.wisc.edu (Grace Wahba) Date: Tue, 13 Jul 1999 19:22:17 -0500 (CDT) Subject: gss software Message-ID: <199907140022.TAA24210@hera.stat.wisc.edu> The following software may be of interest to connectionists. It is for use with R, which is a public statistical package with many utilities of use to statisticians. The main R master site is http://www.ci.tuwien.ac.at/R/, a US mirror site is http://cran.stat.wisc.edu/, and the packages, including gss below, can be found at http://www.ci.tuwien.ac.at/R/src/contrib/PACKAGES.html ............................ ------- Start of forwarded message ------- From chong at stat.purdue.edu Tue Jul 6 12:40:14 1999 From: chong at stat.purdue.edu (Chong Gu) Date: Tue, 6 Jul 1999 11:40:14 -0500 Subject: New smoothing spline package gss Message-ID: Dear fellow R users, I just uploaded a new package gss to ftp.ci.tuwien.ac.at. The package name gss stands for General Smoothing Spline. In the current version (0.4-1), it handles nonparametric multivariate regression with Gaussian, Binomial, Poisson, Gamma, Inverse Gaussian, and Negative Binomial responses. I am still working on code for density estimation and hazard rate estimation to be made available in future releases. On the modeling side, gss uses tensor-product smoothing splines to construct nonparametric ANOVA structures using cubic spline, linear spline, and thin-plate spline marginals. The popular (main-effect-only) additive models are special cases of nonparametric ANOVA models. The syntax of gss functions resembles that of the lm and glm suites. Among new features that are not available from other spline packages are the standard errors needed for the construction of Wahba's Bayesian confidence intervals for smoothing spline fits, so you may want to try out gss even if you only want to calculate a univariate cubic spline or a single-term thin-plate spline.
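gss itself is an R package, but for readers who want a quick feel for what a univariate cubic smoothing spline does, here is a rough analogue in Python using scipy's UnivariateSpline. This is a different implementation from gss/RKPACK, and the smoothing-parameter choice below is an arbitrary illustration, not gss's automatic selection.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
truth = np.sin(2.0 * np.pi * x)
y = truth + rng.normal(scale=0.2, size=x.size)   # noisy observations

# Cubic (k=3) smoothing spline; the parameter s bounds the residual
# sum of squares, so larger s gives a smoother fit. Here s is set to
# roughly n * sigma^2, a common rule of thumb.
spline = UnivariateSpline(x, y, k=3, s=x.size * 0.04)
fit = spline(x)

# The smoothed fit should sit closer to the true curve than the raw data.
print(np.mean((fit - truth) ** 2) < np.mean((y - truth) ** 2))
```

The trade-off controlled by s is the same one gss's smoothing parameters control: fidelity to the data versus smoothness of the estimate.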
For those familiar with smoothing splines, gss is a front end to RKPACK, which encodes O(n^3) generic algorithms for reproducing kernel based smoothing spline calculation. Reports on bugs and suggestions for improvements/new features are most welcome. Chong Gu -.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.- r-announce mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html Send "info", "help", or "[un]subscribe" (in the "body", not the subject !) To: r-announce-request at stat.math.ethz.ch _._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._ From harnad at coglit.ecs.soton.ac.uk Wed Jul 14 13:24:52 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Wed, 14 Jul 1999 18:24:52 +0100 (BST) Subject: Neural Organization: BBS call for Multiple Review Message-ID: Below is the abstract of the Precis of a book that will shortly be circulated for Multiple Book Review in Behavioral and Brain Sciences (BBS): PRECIS OF Structure, Function, and Dynamics: An Integrated Approach to Neural Organization: BBS MULTIPLE BOOK REVIEW by Michael Arbib, Peter Erdi and John Szentagothai This book has been accepted for a multiple book review to be published in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Reviewers must be BBS Associates or nominated by a BBS Associate. (All prior BBS referees, editors, authors, and commentators are also equivalent to Associates.)
To be considered as a reviewer for this book, to suggest other appropriate reviewers, or for information about how to become a BBS Associate, please send EMAIL BEFORE August 13, 1999, to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of reviewers, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a reviewer. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp, or gopher according to the instructions that follow after the abstract. Please also specify 1) whether you need the book, and 2) whether you can make it by the deadline of October 15, 1999. Please note that it is the book, not the Precis, that is to be reviewed. It would be helpful if you indicated in your reply whether you already have the book or would require a copy. _____________________________________________________________ PRECIS OF: Structure, Function, and Dynamics: An Integrated Approach to Neural Organization BBS MULTIPLE BOOK REVIEW Michael Arbib Director, USC Brain Project, University of Southern California Los Angeles, CA 90089-2520 USA. Arbib at pollux.usc.edu Peter Erdi Head, Dept. Biophysics KFKI Research Institute for Particle and Nuclear Physics of the Hungarian Academy of Sciences H-1525 Budapest, P.O. Box 49, Hungary.
erdi at rmki.kfki.hu ABSTRACT: "Neural Organization: Structure, Function, and Dynamics" (Arbib, Erdi, and Szentagothai, 1997, Cambridge, MA: The MIT Press; henceforth Organization) shows how theory and experiment can supplement each other in an integrated, evolving account of structure, function, and dynamics. New data lead to new models; new models suggest the design of new experiments. Much of modern neuroscience seems excessively reductionist, focusing on the study of ever smaller microsystems with little appreciation of their contribution to the behaving organism. We welcome these new data but are concerned to restore some equilibrium between systems, cellular, and molecular neuroscience. After a brief tribute to our late colleague John Szentagothai, we trace the threads of Structure, Function and Dynamics as they weave through the book, thus providing a broad general framework for the integration of computational and empirical neuroscience. Part II of Organization presents a structural analysis of various brain regions - olfactory bulb and cortex, hippocampus, cerebral cortex, cerebellum, and basal ganglia - as a prelude to our account of the dynamics of the neural circuits and function of each region. To exemplify this approach, this precis analyzes the hippocampus in anatomical, dynamical, and functional terms. We conclude by pointing the way to the use of our methodology in the development of Cognitive Neuroscience. KEYWORDS: neural organization, dynamics, Szentagothai, computational neuroscience, neural modeling, modular architectonics, neural plasticity, hippocampus, rhythmogenesis, cognitive maps, memory. ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft.
Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.arbib.html ------------------------------------------------------------------ *** FIVE IMPORTANT ANNOUNCEMENTS *** (1) There have been some very important developments in the area of Web archiving of scientific papers very recently. Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ -------------------------------------------------------------------- (4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles.
The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html --------------------------------------------------------------------- (5) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!).
From hali at theophys.kth.se Wed Jul 14 19:01:22 1999 From: hali at theophys.kth.se (Hans Liljenström) Date: Thu, 15 Jul 1999 01:01:22 +0200 Subject: Registration for Agora'99 Message-ID: <378D16C1.403F7A06@theophys.kth.se> ***************************************************************************** REGISTRATION INFORMATION for 1999 Agora Meeting on Fluctuations in Biological Systems (Agora'99) August 3-7, 1999, Sigtuna, Sweden Information and registration at http://www.theophys.kth.se/~hali/agora/agora99 ***************************************************************************** SCOPE This interdisciplinary conference on fluctuations in biological systems will be held in the small old town of Sigtuna, Sweden, Aug 3-7, 1999, and follows upon a series of workshops, the first of which was held in Sigtuna, Sep 4-9 1995 (Sigtuna Workshop 95). The approach at these meetings is theoretical as well as experimental, and the meetings are intended to attract participants from various fields, such as biology, physics, and computer science. MOTIVATION Life is normally associated with a high degree of order and organization. However, disorder, in various contexts referred to as fluctuations, noise, or chaos, is also a crucial component of many biological processes. For example, in evolution random errors in the reproduction of the genetic material provide a variation that is fundamental for the selection of adaptive organisms. At a molecular level, thermal fluctuations govern the movements and functions of the macromolecules in the cell. Yet, it is also clear that too large a variation may have disastrous effects. Uncontrolled processes need stabilizing mechanisms. More knowledge of the stability requirements of biological processes is needed in order to better understand these problems, which also have important medical applications. Many diseases, for instance certain degenerations of brain cells, are caused by failure of the stabilizing mechanisms in the cell.
Stability is also important and difficult to achieve in biotechnological applications. There is also randomness in the structure and function of the neural networks of the brain. Spontaneous firing of neurons seems to be important for maintaining an adequate level of activity, but does this "neuronal noise" have any other significance? What are the effects of errors and fluctuations in the information processing of the brain? Can these microscopic fluctuations be amplified to provide macroscopic effects? Often, one cannot easily determine whether an apparently random process is due to noise, governed by uncontrolled degrees of freedom, or whether it is a result of "deterministic chaos". Would the difference be of any importance for biology? In particular, could chaos, which is characterized by sensitivity and divergence, be useful for any kind of information processing that normally depends upon stability and convergence? OBJECTIVE The objective of this meeting is to address questions and problems related to those above, for a deeper understanding of the effects of disorder in biological systems. Fluctuations and chaos have been extensively studied in physics, but to a much lesser degree in biology. Important concepts from physics, such as "noise-induced state transitions" and "controlled chaos", could also be of relevance for biological systems. Yet, little has been done about such applications, and a more critical analysis of the positive and negative effects of disorder for living systems is needed. It is essential to make concrete and testable hypotheses, and to avoid the kind of superficial and more fashionable treatment that often dominates the field. By bringing together scientists with knowledge and insights from different disciplines we hope to shed more light on these problems, which we think are profound for understanding the phenomenon of life.
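The constructive role of noise raised above (can microscopic fluctuations have macroscopic effects?) can be made concrete in a few lines of Python; the signal shape, threshold, and noise level below are arbitrary choices for illustration only, not material from the meeting:

```python
import numpy as np

rng = np.random.default_rng(0)

# A periodic input that on its own never reaches the firing threshold.
t = np.arange(0.0, 200.0, 0.1)
signal = 0.8 * np.sin(2 * np.pi * t / 20.0)   # peak amplitude 0.8
threshold = 1.0

def rising_crossings(noise_std):
    """Count upward threshold crossings of signal plus Gaussian noise."""
    x = signal + rng.normal(0.0, noise_std, size=t.shape)
    above = x > threshold
    return int(np.sum(above[1:] & ~above[:-1]))

quiet = rising_crossings(0.0)   # no noise: the signal alone stays subthreshold
noisy = rising_crossings(0.3)   # moderate noise carries the peaks over threshold
print(quiet, noisy)
```

With no noise there are no crossings at all; with moderate noise, crossings cluster near the signal's peaks, so the fluctuations effectively make the subthreshold rhythm visible at the output.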
TOPICS Topics include various aspects, experimental as well as theoretical, of fluctuations, noise, and chaos in biological systems at a microscopic (molecular), mesoscopic (cellular), and macroscopic (network and systems) level. Contributions are welcome regarding, among others, the following topics: - Biological signals and noise - Neural information processing - Synaptic fluctuations - Spontaneous neural firing - Macromolecular dynamics - Dynamics of microtubuli - Ion channel kinetics - Cell motility - Medical implications INVITED SPEAKERS Sergey Bezrukov, National Institutes of Health, Bethesda, USA Hans Braun, Dept. of Physiology, University of Marburg, Germany Anders Ehrenberg, Dept. of Biophysics, Stockholm University, Sweden Hans Frauenfelder, Los Alamos National Laboratory, New Mexico, USA Louis De Felice, Vanderbilt University, USA Hermann Haken, Institute of Theoretical Physics and Synergetics, Univ. of Stuttgart, Germany Uno Lindberg, Dept. of Cell Biology, Stockholm University, Sweden Matsuno, Department of Biomedical Engineering, Nagaoka Univ. of Technology, Japan Frank Moss, Dept. of Physics, University of Missouri, St Louis, USA Erik Mosekilde, Dept of Physics, Technical University of Denmark, Lyngby, Denmark Sakire Pogun, Center for Brain Research, Ege University, Turkey Rudolf Rigler, Dept. of Medical Biochemistry and Biophysics, Karolinska Institutet, Sweden Stephen Traynelis, Dept. of Pharmacology and Physiology, Emory University, Georgia, USA Horst Vogel, Swiss Federal Institute of Technology, Lausanne, Switzerland Jim J. Wright, Mental Health Research Institute of Victoria, Melbourne, Australia Michail Zhadin, Dept.
of Physiology, Pushchino, Russia REGISTRATION and abstract submission can preferably be done via the Agora'99 home page: http://www.theophys.kth.se/~per/form REGISTRATION FEES Regular: 2000 SEK Students: 1000 SEK FURTHER INFORMATION available from: Hans Liljenstrom or Peter Arhem Agora for Biosystems Box 57, SE-193 22 Sigtuna, Sweden Phone/Fax: +46-8-592 50901 Email: peter.arhem at neuro.ki.se, hali at theophys.kth.se WWW: http://www.theophys.kth.se/~hali/agora From shastri at ICSI.Berkeley.EDU Wed Jul 14 21:55:34 1999 From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri) Date: Wed, 14 Jul 1999 18:55:34 PDT Subject: Structured connectionism, temporal synchrony, and evidential reasoning Message-ID: <199907150155.SAA20814@lassi.ICSI.Berkeley.EDU> Dear Connectionists: The following paper may be of interest to you. It demonstrates that a fusion of structured connectionism, temporal synchrony, evidential reasoning, and fast synaptic potentiation can lead to a neural network capable of rapidly establishing causal and referential coherence. Best wishes. -- Lokendra Shastri --------------------------------------- http://www.icsi.berkeley.edu/~shastri/psfiles/fusion99.ps OR http://www.icsi.berkeley.edu/~shastri/psfiles/fusion99.pdf Title: Knowledge Fusion in the Large -- taking a cue from the brain Lokendra Shastri and Carter Wendelken International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704 In Proceedings of the Second International Conference on Information Fusion, Sunnyvale, CA, July 1999, pp. 1262-1269. From rog1 at psu.edu Thu Jul 15 08:25:57 1999 From: rog1 at psu.edu (Rick Gilmore) Date: Thu, 15 Jul 1999 08:25:57 -0400 Subject: Senior-level faculty position in neuroscience Message-ID: SENIOR-LEVEL NEUROSCIENTIST. As part of a major university-wide expansion in the life sciences, the Department of Psychology at Penn State University announces a search in cognitive, computational, or behavioral neuroscience.
Senior-level candidates who wish to play a leadership role in building Penn State's growing programs in neuroscience are encouraged to apply. Send a statement of research and teaching interests, vita, and recent reprints to: Neuroscience Search, 617 Moore Building, Department of Psychology, University Park, PA 16802. The Pennsylvania State University is an equal opportunity employer. From tlfine at ANISE.EE.CORNELL.EDU Thu Jul 15 20:09:42 1999 From: tlfine at ANISE.EE.CORNELL.EDU (Terrence Fine) Date: Thu, 15 Jul 1999 20:09:42 -0400 (EDT) Subject: new book, Feedforward Neural Network Methodology Message-ID: A new monograph, Feedforward Neural Network Methodology, by Terrence L. Fine, has appeared in the Springer Series on Statistics for Engineering and Information Science. It is priced at $69.95 for 340 pages. This monograph provides a thorough and coherent introduction to the mathematical properties of feedforward neural networks and to the computationally intensive methodology that has enabled their successful application to complex problems of pattern classification, forecasting, regression, and nonlinear systems modeling. Coherence is achieved by focusing on the class of feedforward neural networks, also called multilayer perceptrons, and orienting the discussion around the four questions: What functions can the network architecture implement or closely approximate? What is the complexity of an achievable implementation? Given the resources dictated by the complexity considerations, how do you select a network to achieve the task? How well will the selected network learn or generalize to new problem instances? 
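The class of models the monograph is organized around, the one-hidden-layer feedforward network (multilayer perceptron), can be sketched in a few lines of NumPy. The layer sizes and random weights below are illustrative assumptions, not material from the book:

```python
import numpy as np

def mlp(x, W1, b1, W2, b2):
    """One-hidden-layer feedforward network: affine map, sigmoid, affine map."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))   # hidden-layer activations
    return W2 @ h + b2                          # real-valued output node

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)   # 2 inputs -> 3 hidden units
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)   # 3 hidden units -> 1 output

y = mlp(np.array([0.5, -1.0]), W1, b1, W2, b2)
print(y.shape)
```

The monograph's four questions then map directly onto this object: which functions such compositions can approximate, how many units and weights an implementation needs, how to choose the weights from data, and how the fitted network behaves on new inputs.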
Table of Contents Preface Chapter 1: Background and Organization Chapter 2: Perceptrons---Networks with a Single Node Chapter 3: Feedforward Networks I: Generalities and LTU Nodes Chapter 4: Feedforward Networks II: Real-valued Nodes Chapter 5: Algorithms for Designing Feedforward Networks Chapter 6: Architecture Selection and Penalty Terms Chapter 7: Generalization and Learning Appendix: A Note on Use as a Text References Index From achilles at uom.gr Fri Jul 16 04:50:54 1999 From: achilles at uom.gr (Achilles D. Zapranis) Date: Fri, 16 Jul 1999 11:50:54 +0300 Subject: new book Message-ID: <003d01becf68$51687060$cf535cc1@uom.gr> New book (monograph): "Principles of Neural Model Identification, Selection and Adequacy - With Applications to Financial Econometrics" Springer - Verlag ISBN 1-85233-139-9 Achilleas Zapranis and Apostolos-Paul Refenes Neural networks are receiving much attention because of their powerful universal approximation properties. They are essentially devices for non-parametric statistical inference, providing an elegant formalism for unifying different non-parametric paradigms, such as nearest neighbours, kernel smoothers, projection pursuit, etc. Neural networks have shown considerable success in a variety of disciplines, ranging from engineering and control to financial modelling. However, a major weakness of neural modelling is the lack of established procedures for performing tests for misspecified models and tests of statistical significance for the various parameters that have been estimated. This is a serious disadvantage in applications where there is a strong culture for testing not only the predictive power of a model or the sensitivity of the dependent variable to changes in the inputs but also the statistical significance of the finding at a specified level of confidence. This is very important in the majority of financial applications, where the data generating processes are dominantly stochastic and only partially deterministic.
In this book we investigate a broad range of issues arising in relation to their use as non-parametric statistical tools, including controlling the bias and variance parts of the estimation error, eliminating parameter and explanatory-variable redundancy, assessing model adequacy, and estimating sampling variability. Based upon the latest, most significant developments in estimation theory, model selection, and the theory of misspecified models, this book develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the theoretical framework and demonstrates, through a selected case study and examples, the efficient use of neural networks for modelling complex financial phenomena. The majority of existing books on neural networks and their application to finance concentrate on the intricate algorithmic aspects of neural networks, the bulk of which is irrelevant to practitioners in this field. They use terminology that is incomprehensible to professional financial engineers, statisticians and econometricians, who are the natural readership in this subject. Neural networks are essentially statistical devices for non-linear, non-parametric regression analysis, but most of the existing literature discusses neural networks as a form of artificial intelligence. In our opinion this work meets an urgent demand for a textbook illustrating how to use neural networks in real-life financial contexts and provides methodological guidelines on how to develop robust applications that work from a platform of statistical insight.
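The sampling-variability estimation mentioned above can be sketched with a pairs bootstrap on a plain linear regression; the same resampling logic carries over to a fitted neural model. The data, sample sizes, and parameter values here are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data with a known slope of 2.
n = 200
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.5, size=n)

def fitted_slope(xs, ys):
    """Ordinary least-squares slope (with intercept)."""
    X = np.column_stack([np.ones_like(xs), xs])
    coef, *_ = np.linalg.lstsq(X, ys, rcond=None)
    return coef[1]

# Pairs bootstrap: resample (x_i, y_i) pairs with replacement and refit.
boot = []
for _ in range(500):
    idx = rng.integers(0, n, size=n)
    boot.append(fitted_slope(x[idx], y[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])   # 95% bootstrap interval for the slope
print(round(lo, 2), round(hi, 2))
```

The spread of the bootstrap replications estimates the sampling variability of the coefficient, which is exactly what hypothesis tests for variable significance are built on.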
Contents: 1 INTRODUCTION 1.1 Overview 1.2 Active Asset Management, Neural Networks and Risk 1.2.1 Factor Analysis 1.2.2 Estimating Returns 1.2.3 Portfolio Optimisation 1.3 Non-Parametric Estimation with Neural Networks 1.3.1 Sources of Specification Bias 1.3.2 Principles of Neural Models Identification 1.4 Overview of the Remaining Chapters 2 NEURAL MODEL IDENTIFICATION 2.1 Overview 2.2 Neural Model Selection 2.2.1 Model Specification 2.2.2 Fitness Criteria 2.2.3 Parameter Estimation Procedures 2.2.4 Consistency and the Bias-Variance Dilemma 2.3 Variable Significance Testing 2.3.1 Relevance Quantification 2.3.2 Sampling Variability Estimation 2.3.3 Hypothesis Testing 2.4 Model Adequacy Testing 2.5 Summary 3 Review of Current Practice in Neural Model Identification 3.1 Overview 3.2 Current Practice in Neural Model Selection 3.2.1 Regularisation 3.2.2 Topology-Modifying Algorithms 3.2.3 The Structural Risk Minimisation (SRM) Principle 3.2.4 The Minimum Description Length (MDL) Principle 3.2.5 The Maximum a-Posteriori Probability (MAP) Principle 3.2.6 The Minimum Prediction Risk (MPR) Principle 3.3 Variable Significance Testing 3.3.1 Common Relevance Criteria 3.3.1.1 Criteria based on the Derivative dy/dx 3.3.1.2 Alternative Criteria 3.3.1.3 Comparing between Different Relevance Criteria 3.3.2 Sampling Variability and Bias Estimation with Bootstrap 3.3.2.1 Pairs Bootstrap 3.3.2.2 Residuals Bootstrap 3.3.2.3 Bias Estimation 3.3.3 Hypothesis Tests for Variable Selection 3.4 Model Adequacy Testing : Misspecification Tests 3.5 Summary 4 NEURAL MODEL SELECTION : THE MINIMUM PREDICTION RISK PRINCIPLE 4.1 Overview 4.2 Algebraic Estimation of Prediction Risk 4.3 Estimation of Prediction Risk with Resampling Methods 4.3.1 The Bootstrap and Jack-knife Methods for Estimating Prediction Risk 4.3.2 Cross-Validatory Methods for Estimating Prediction Risk 4.4 Evaluation of Model Selection Procedures 4.4.1 Experimental Set-Up 4.4.2 Algebraic Estimates 4.4.3 Bootstrap Estimates 4.4.4 
Discussion 4.5 Summary 5 VARIABLE SIGNIFICANCE TESTING : A STATISTICAL APPROACH 5.1 Overview 5.2 Relevance Quantification 5.2.1 Sensitivity Criteria 5.2.2 Model-Fitness Sensitivity Criteria 5.2.2.1 The Effect on the Empirical Loss of a Small Perturbation of x 5.2.2.2 The Effect on the Empirical Loss of Replacing x by its Mean 5.2.2.3 Effect on the Coefficient of Determination of a Small Perturbation of x 5.3 Sampling Variability Estimation 5.3.1 Local Bootstrap for Neural Models 5.3.2 Stochastic Sampling from the Asymptotic Distribution of the Network's Parameters (Parametric Sampling) 5.3.3 Evaluation of Bootstrap Schemes for Sampling Variability Estimation 5.3.3.1 Example 1 : The Burning Ethanol Sample 5.3.3.2 Example 2 : Wahba's Function 5.3.3.3 Example 3 : Network Generated Data 5.4 Hypothesis Testing 5.4.1 Confidence Intervals 5.4.2 Evaluating the Effect of a Variable's Removal 5.4.3 Variable Selection with Backwards Elimination 5.5 Evaluation of Variable Significance Testing 5.6 Summary 6 MODEL ADEQUACY TESTING 6.1 Overview 6.2 Testing for Serial Correlation in the Residuals 6.2.1 The Correlogram 6.2.2 The Box-Pierce Q-Statistic 6.2.3 The Ljung-Box LB-Statistic 6.2.4 The Durbin-Watson Test 6.3 An F-test for Model Adequacy 6.4 Summary 7 NEURAL NETWORKS IN TACTICAL ASSET ALLOCATION 7.1 Overview 7.2 Quantitative Models for Tactical Asset Allocation 7.3 Data Pre-Processing 7.4 Forecasting the Equity Premium with Linear Models 7.4.1 Model Estimation 7.4.2 Model Adequacy Testing 7.4.2.1 Testing the Assumptions of Linear Regression 7.4.2.2 The Effect of Influential Observations 7.4.3 Variable Selection 7.5 Forecasting the Equity Premium with Neural Models 7.5.1 Model Selection and Adequacy Testing 7.5.2 Variable Selection 7.5.2.1 Relevance Quantification 7.5.2.2 Sampling Variability Estimation 7.5.2.3 Backwards Variable Elimination 7.6 Comparative Performance Evaluation 7.7 Summary 8 CONCLUSIONS Appendix I : Computing Network Derivatives Appendix II :
Generating Random Deviates Bibliography --------------------------------------------------------------------------------------------- Dr Achilleas D. Zapranis University of Macedonia of Economic and Social Sciences 156 Egnatia St. PO BOX 1591, 540 06 Thessaloniki Greece Tel. 00-30-(0)31-891690, Fax 00-30-(0)31-844536 e-mail : achilles at macedonia.uom.gr -------------------------------------------------------------------------------------------- From axon at cortex.rutgers.edu Fri Jul 16 09:53:01 1999 From: axon at cortex.rutgers.edu (Ralph Siegel) Date: Fri, 16 Jul 1999 09:53:01 -0400 Subject: Postdoctoral positions - visual system Message-ID: <001801becf92$850ab980$2a660680@stp.rutgers.edu> Postdoctoral Trainee (two positions). Analysis of visual structure-from-motion in primates. Representation of optic flow and attention in the temporal (STPa) and parietal (7a) lobes is being examined in the awake behaving monkey. These studies utilize single-unit recording, optical recording, and microdialysis techniques. The laboratory also has ongoing fMRI, human psychophysical, and computational studies. Recent graduates who are changing fields from either cellular or computational neuroscience to behavioral and physiological studies are particularly encouraged to apply. Computer expertise useful, but not necessary. Superb experimental and computational facilities in a multi-disciplinary research center. NY-NJ Metro area. Contact: Ralph Siegel, Ph.D. Center for Molecular and Behavioral Neuroscience Rutgers, The State University 197 University Avenue Newark, NJ 07102 phone: 973-353-1080 x3261 fax: 973-353-1272 axon at cortex.rutgers.edu http://www.cmbn.rutgers.edu/cmbn/faculty/rsiegel.html Term: Beginning July 1, 1999 Salary: NIH levels Please send statement of research interests, curriculum vitae, and names of three references.
From ted.carnevale at yale.edu Sat Jul 17 18:07:12 1999 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Sat, 17 Jul 1999 18:07:12 -0400 Subject: URL for NEURON at Yale Message-ID: <3790FE90.7A332295@yale.edu> NEURON's WWW site at Yale has been reorganized. As a consequence, the sole valid URL for NEURON's home page at Yale is http://www.neuron.yale.edu ^^^^^^ The URL of the Duke site remains http://neuron.duke.edu as before. Please excuse me if you also receive this notice from some other source. --Ted From terry at salk.edu Tue Jul 20 17:19:27 1999 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 20 Jul 1999 14:19:27 -0700 (PDT) Subject: NEURAL COMPUTATION 11:6 Message-ID: <199907202119.OAA07748@helmholtz.salk.edu> Neural Computation - Contents - Volume 11, Number 6 - August 15, 1999 VIEW Seeing White: Qualia In The Context of Decoding Population Codes Sidney R. Lehky and Terrence J. Sejnowski NOTE Adaptive Calibration of Imaging Array Detectors Marco Budinich and Renato Frison LETTERS Modeling The Combination Of Motion, Stereo, And Vergence Angle Cues To Visual Depth I. Fine and Robert A. Jacobs Inferring The Features Of Recognition From Viewpoint-Dependent Recognition Data Florin Cutzu and Michael Tarr Backprojections In The Cerebral Cortex: Implications For Memory Storage Alfonso Renart, Nestor Parga and Edmund T. Rolls The Relationship Between Synchronization Among Neuronal Populations and Their Overall Activity Levels D. Chawla, E. D. Lumer, K. J.
Friston Fast Calculation Of Short-Term Depressed Synaptic Conductances Michele Giugliano, Marco Bove and Massimo Grattarola Algorithmic Stability And Sanity-Check Bounds For Leave-One-Out Cross-Validation Michael Kearns and Dana Ron Convergence Properties Of The Softassign Quadratic Assignment Algorithm Anand Rangarajan, Alan Yuille, and Eric Mjolsness Learning to Design Synergetic Computers with an Expanded Symmetric Diffusion Network Koji Okuhara, Shonji Osaki, and Masaaki Kijima ----- ABSTRACTS - http://mitpress.mit.edu/NECO/ SUBSCRIPTIONS - 1999 - VOLUME 11 - 8 ISSUES

                   USA     Canada*   Other Countries
  Student/Retired  $50     $53.50    $84
  Individual       $82     $87.74    $116
  Institution      $302    $323.14   $336

* includes 7% GST (Back issues from Volumes 1-10 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From juergen at idsia.ch Mon Jul 19 10:24:48 1999 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Mon, 19 Jul 1999 16:24:48 +0200 Subject: predictability minimization Message-ID: <199907191424.QAA13716@ruebe.idsia.ch> Processing Images by Semi-Linear Predictability Minimization ------------------------------------------------------------ Nicol N. Schraudolph, Martin Eldracher & Juergen Schmidhuber IDSIA, Lugano, Switzerland www.idsia.ch Network: Computation in Neural Systems, 10(2):133-169, 1999 In the predictability minimization approach (Neural Computation, 4(6):863-879, 1992), input patterns are fed into a system consisting of adaptive, initially unstructured feature detectors. There are also adaptive predictors constantly trying to predict current feature detector outputs from other feature detector outputs.
Simultaneously, however, the feature detectors try to become as unpredictable as possible, resulting in a co-evolution of predictors and feature detectors. This paper describes the implementation of a visual processing system trained by semi-linear predictability minimization, and presents many experiments that examine its response to artificial and real-world images. In particular, we observe that under a wide variety of conditions, predictability minimization results in the development of well-known visual feature detectors. ftp://ftp.idsia.ch/pub/juergen/pm.ps.gz Many additional papers now also available in postscript format: http://www.idsia.ch/~juergen/onlinepub.html Nici & Martin & Juergen From michael at cs.unm.edu Thu Jul 22 09:59:12 1999 From: michael at cs.unm.edu (Zibulevsky Michael) Date: Thu, 22 Jul 1999 07:59:12 -0600 Subject: new paper: Blind Source Separation by Sparse Decomposition Message-ID: Announcing a new paper.... Title: Blind Source Separation by Sparse Decomposition Authors: Michael Zibulevsky and Barak A. Pearlmutter Abstract The blind source separation problem is to extract the underlying source signals from a set of their linear mixtures, where the mixing matrix is unknown. This situation is common, e.g., in acoustics, radio, and medical signal processing. We exploit the property of the sources to have a sparse representation in a corresponding (possibly overcomplete) signal dictionary. Such a dictionary may consist of wavelets, wavelet packets, etc., or be obtained by learning from a given family of signals. Starting from the maximum a posteriori framework, which is applicable to the case of more sources than mixtures, we derive a few other categories of objective functions, which provide faster and more robust computations when there are an equal number of sources and mixtures. Our experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques.
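The sparsity principle in the abstract above can be illustrated with a deliberately simplified sketch; the two-source setup, the orthogonal (rotation) mixing matrix, and the brute-force angle search are simplifications for illustration, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two sparse sources: mostly zero, with occasional large spikes.
n = 2000
S = rng.laplace(size=(2, n)) * (rng.random((2, n)) < 0.1)

# Mix them with a rotation that the separator does not know.
theta = 0.6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = A @ S

def l1_after_unmixing(phi):
    """L1 norm of the outputs after unmixing with a rotation by -phi."""
    W = np.array([[np.cos(phi),  np.sin(phi)],
                  [-np.sin(phi), np.cos(phi)]])
    return float(np.abs(W @ X).sum())

# The sparse prior says the correct unmixing makes the outputs as sparse as
# possible, i.e. it minimizes their L1 norm; search the angle by brute force.
angles = np.linspace(0.0, np.pi / 2, 1000)
recovered = angles[np.argmin([l1_after_unmixing(p) for p in angles])]
print(round(recovered, 2))
```

Any wrong angle smears each spike across both output channels and inflates the L1 norm, so the minimum sits at the true mixing angle (up to the usual permutation and sign ambiguities of blind separation).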
URL of the ps file: http://iew3.technion.ac.il:8080/~mcib/ Contact: michael at cs.unm.edu, bap at cs.unm.edu From shs7 at cornell.edu Thu Jul 22 14:43:14 1999 From: shs7 at cornell.edu (Steven Strogatz) Date: Thu, 22 Jul 1999 14:43:14 -0400 Subject: visiting position in Nonlinear Systems, Cornell Message-ID: A non-text attachment was scrubbed... Name: not available Type: text/enriched Size: 2491 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/ea156b90/attachment-0001.bin From chella at unipa.it Fri Jul 23 06:20:06 1999 From: chella at unipa.it (Antonio Chella) Date: Fri, 23 Jul 1999 12:20:06 +0200 Subject: INTERNATIONAL SCHOOL ON NEURAL NETS <> Message-ID: <379841D4.CA914FDD@unipa.it> ============================================================= INTERNATIONAL SCHOOL ON NEURAL NETS <> 4th Course: Subsymbolic Computation in Artificial Intelligence ERICE-SICILY: October 24-31, 1999 ============================================================= MOTIVATIONS Autonomous intelligent agents that perform complex real world tasks must be able to build and process rich internal representations that allow them to effectively draw inferences, make decisions, and, in general, perform reasoning processes concerning their own tasks. Within the computational framework of artificial intelligence (AI) this problem has been faced in different ways. According to the classical, symbolic approach, internal representations are conceived in terms of linguistic structures, as expressions of a "language of thought". Other traditions developed approaches that are less linguistically oriented, and more biologically and anatomically motivated. It is the case of neural networks, and of self-organizing and evolutionary algorithms. Empirical results concerning natural intelligent systems suggest that such approaches are not fully incompatible, and that different kinds of representation may interact.
Similarly, it can be argued that the design of artificial intelligent systems can take advantage of different kinds of interacting representations that are suited for different tasks. In this perspective, theoretical frameworks and methodological techniques are needed that make it possible to employ different kinds of representation together in a principled way. In particular, autonomous agents need to find the meaning of the symbols they use within their internal processes and in the interaction with the external world, thus overcoming the well-known symbol grounding problem. An information processing architecture for autonomous intelligent agents should exhibit processes that act on suitable intermediate levels, which are intermediary among sensory data, the symbolic level, and actions. These processes could be defined in terms of subsymbolic computation paradigms, such as neural networks, self-organizing algorithms, and evolutionary algorithms. DIRECTOR OF THE COURSE: Salvatore Gaglio DIRECTOR OF THE SCHOOL: M.I.Jordan - M.Marinaro DIRECTOR OF THE CENTRE: A.
Zichichi SCIENTIFIC SECRETARIAT: Edoardo Ardizzone, Antonio Chella, Marcello Frixione WEB PAGE OF THE COURSE: http://dijkstra.cere.pa.cnr.it/ScuolaErice/ ============================================================= SPONSORS + ASEIT Advanced School on Electronics and Information Technology + AI*IA Associazione Italiana per l'Intelligenza Artificiale (Italian Association for Artificial Intelligence) + CERE - CNR CEntro di Studio sulle Reti di Elaboratori (Center of Study on Computer Networks) + CITC Centro Interdipartimentale sulle Tecnologie della Conoscenza + Commission of the European Communities + IEEE Neural Network Council + IIASS Istituto Internazionale per gli Alti Studi Scientifici (International Institute for Advanced Scientific Studies) + Ministero Italiano della Pubblica Istruzione (Italian Ministry of Education) + MURST Ministero dell'Universita' e della Ricerca Scientifica e Tecnologica (Italian Ministry of University and Scientific Research) + CNR Consiglio Nazionale delle Ricerche (Italian National Research Council) + Regione Siciliana (Sicilian Regional Government) + SIREN Societa' Italiana Reti Neuroniche (Italian Neural Networks Society) + University of Genoa + University of Palermo + University of Salerno ============================================================= PROGRAM FOUNDATIONS Introduction to Artificial Intelligence: Luigia Carlucci Aiello (University of Roma, "La Sapienza", Italy) Neural modelling of higher order cognitive processes: John Taylor (King's College, London, UK) Connectionist Models for Data Structures: Marco Gori (University of Siena, Italy) Neural Systems Engineering: Igor Aleksander (Imperial College, London, UK) ASEIT (Advanced School on Electronics and Information Technology) OPEN INTERNATIONAL WORKSHOP ON SUBSYMBOLIC TECHNIQUES AND ALGORITHMS Peter Gärdenfors (Lund University, Sweden) Igor Aleksander (Imperial College, London, UK) Teuvo Kohonen (Helsinki University of Technology, Finland) John Taylor (King's College,
London, UK) Ronald Arkin (Georgia Institute of Technology, USA) REPRESENTATION Conceptual Spaces: Peter Gärdenfors (Lund University, Sweden) Topological Self Organizing Maps: Teuvo Kohonen (Helsinki University of Technology, Finland) Symbolic representations: Luigia Carlucci Aiello (University of Roma, "La Sapienza", Italy) VISUAL PERCEPTION Evolutionary Processes for Artificial Perception: Giovanni Adorni, Stefano Cagnoni (University of Parma, Italy) Cognitive Architectures for Artificial Vision: Antonio Chella (Univ. of Palermo, Italy), Marcello Frixione (Univ. of Salerno, Italy), Salvatore Gaglio (Univ. of Palermo, Italy) Algorithms for Image Analysis: Vito Di Gesù (Univ. of Palermo, Italy) ACTION Motion Maps: Pietro Morasso (University of Genoa, Italy) Interacting with the External World: Luc Steels (Free University of Brussels, Belgium) Reinforcement Learning in Autonomous Robots: Christian Balkenius (Lund University, Sweden) Behavior-Based Robotics: Ronald Arkin (Georgia Institute of Technology, USA) ============================================================= APPLICATIONS Interested candidates should send a letter to the Director of the Course: Professor Salvatore GAGLIO Dipartimento di Ingegneria Automatica e Informatica Universita' di Palermo Viale delle Scienze 90128 - PALERMO - ITALY Tel: ++39.091.238245 Fax: ++39.091.6529124 E-mail: gaglio at unipa.it They should specify: 1. date and place of birth, together with present nationality; 2. affiliation; 3. address, e-mail address. Please enclose a letter of recommendation from the group leader or the Director of the Institute or from a senior scientist. PLEASE NOTE Participants must arrive in Erice on October 24, not later than 5:00 pm. 
============================================================= From pg at artificial-life.com Sun Jul 25 11:20:36 1999 From: pg at artificial-life.com (Paolo Gaudiano) Date: Sun, 25 Jul 1999 11:20:36 -0400 Subject: Call for Participation Message-ID: <000901bed6b1$4006de60$4ca771d1@kmdg.cove.com> First USA-Italy Conference on Applied Neural and Cognitive Sciences Boston, October 3-6, 1999 The Italian Ministry of Foreign Affairs, through the Embassy of Italy in Washington, has identified Applied Neural and Cognitive Sciences as an area of strategic interest for research and development in Italy over the next 10 to 20 years. UI-CANCS'99, which is sponsored jointly by the Embassy of Italy and by Artificial Life, Inc., compares and contrasts industry and research experiences in Italy and in the United States. UI-CANCS'99 focuses on smart sensors, intelligent robotics, and related technologies with strong potential for industrial, medical and other applications. Invited presentations will define the current state of the art and future developments in these fields; they will compare and contrast scientific and technological resources in the USA and Italy, and they will try to identify the most promising and effective sectors in Italian industries and universities to be developed and supported over the next 10 to 20 years. UI-CANCS'99 offers a unique opportunity to present and discuss a broad range of scientific and industrial projects within the context of applied neural and cognitive sciences. The Conference also aims at strengthening collaborations between the USA and Italy and at developing new joint projects. We hope that this will be the first in a series of international meetings held regularly to monitor progress in these fields, to identify areas in need of further development support, and to offer a meeting point between Italian and American members of industry, academia and other organizations. 
For additional information please visit http://www.usa-italy.org or send e-mail to info at usa-italy.org. ---------------------------------------------------------------------- Conference dates: Sunday, October 3 through Wednesday, October 6, 1999. These dates include a welcoming reception on Sunday evening and an optional visit to local laboratories in the Boston area on Wednesday, October 6. Invited talks will be given all day Monday and Tuesday morning, while a round-table discussion will be held on Tuesday afternoon. Location: George Sherman Union, Boston University, 775 Commonwealth Avenue, Boston, Massachusetts, USA. Organizing Committee: Daniele Amati, SISSA Emilio Bizzi, MIT Paolo Gaudiano, Artificial Life, Inc. Alexander Tenenbaum, Embassy of Italy Official Language: English Presentations: All talks are by invitation. Registration: See registration form at end. ---------------------------------------------------------------------- Preliminary Program SUNDAY, OCTOBER 3, 1999 6.00-9.00pm Registration; Welcoming Reception (with full registration) MONDAY, OCTOBER 4, 1999 8.00-9.00am Registration; Breakfast (with full registration) 9.00-9.45am Opening Remarks 9.45am-1.00pm (with 15-min break at 11.15am) Session 1, Smart Sensors 1.00-3.45pm Lunch 3.45-7.00pm (with 15-min break at 5.15pm) Session 2, Intelligent Robotics and Software Agents 8.00pm Official dinner and plenary talk (with full registration) TUESDAY, OCTOBER 5, 1999 8.00-9.00am Breakfast (with full registration) 9.00am-12.15pm (with 15-min break at 11.15am) Session 3, Biomedical Applications 12.15-1.15pm Session 4.P1, Relationship Between Industry and Research 1.15-3.45pm Lunch 3.45-5.15pm Session 4.P2, Relationship Between Industry and Research 5.30-7.30pm Round-table discussion. 
WEDNESDAY, OCTOBER 6, 1999 9.00am-4.00pm Tour of local labs and companies (with paid registration) ---------------------------------------------------------------------- INVITED SPEAKERS Daniele Amati, International School for Advanced Studies, Trieste, Italy Emilio Bizzi, Massachusetts Institute of Technology, Cambridge, USA Nunzio Bonavita, ABB Elsag Bailey Hartmann & Braun S.p.A., Genova, Italy Paolo Dario, Scuola Superiore Sant'Anna, Pisa, Italy Danilo De Rossi, Universita` degli Studi di Pisa, Italy Arnaldo D'Amico, Universita` degli Studi di Roma "Tor Vergata", Italy Marco Dorigo, Universite' Libre de Bruxelles, Belgium Paolo Gaudiano, Artificial Life, Inc., Boston, USA Helen Greiner, IS Robotics Corporate, Somerville, USA Stephen Grossberg, Boston University, USA Barbara Hayes-Roth, Stanford University and Extempo, Inc., Redwood City, USA Pietro Morasso, Universita` degli Studi di Genova, Italy Riccardo Parenti, Ansaldo Ricerche Automation Group, Genova, Italy Renato Nobili, Universita` degli Studi di Padova, Italy Stefano Nolfi, Centro Nazionale delle Ricerche, Roma, Italy Antonio Pedotti, Politecnico di Milano, Italy Tomaso Poggio, Massachusetts Institute of Technology, Cambridge, USA Eberhard Schoeneburg, Artificial Life, Inc., Boston, USA David Walt, Tufts University, Medford, USA John Wyatt, Massachusetts Institute of Technology, Cambridge, USA ---------------------------------------------------------------------- UI-CANCS'99 REGISTRATION FORM Fill out this form and e-mail it to info at usa-italy.org, or fax it to: UI-CANCS'99 Registration (+1) 508 624 6097 Seating will be limited, so please register as soon as possible ..................................................................... 
First Name: Last Name: E-mail address: Affiliation: Telephone: Fax: Complete Mailing Address: Registration type (see notes 1-3 below): __Talks only (regular) US$100 __Talks only (student) US$30 __Full (regular) US$200 __Full (student) US$75 Credit card information: Card type (check one): __VISA __MC __AMEX Card number: Name on card: Expiration date: Optional information (see notes 4-5 below): * I plan to attend the optional lab tour on October 6 (Full registrants only): __Yes __No * I would like to be considered for partial support through a travel grant: __Yes __No ..................................................................... NOTES: 1. "Talks only" includes attendance at all invited sessions, printed program, badge, and refreshments at all coffee breaks. 2. Full registration includes attendance at all invited sessions, printed program, badge, refreshments at all coffee breaks, the pre-conference evening reception on October 3, breakfast and lunch on October 4 and 5, the banquet and plenary talk on October 4, and the optional tour of Boston-area labs on October 6. 3. Students (up to PhD level only) must attach a letter from their advisor certifying their student status. 4. Lab tours take place on October 6. Space is limited so attendance cannot be guaranteed. Please check www.usa-italy.org for additional information as it becomes available. 5. Travel grants for partial support of travel expenses might be available to a limited number of participants. Priority is given to students and to participants coming from abroad. If you check this option, you will receive e-mail with additional information. 6. For additional information please e-mail info at usa-italy.org or call 800-701-4741 (+1-508-624-5545 from outside the US). 
From ericwan at ece.ogi.edu Mon Jul 26 16:48:40 1999 From: ericwan at ece.ogi.edu (Eric Wan) Date: Mon, 26 Jul 1999 13:48:40 -0700 Subject: postdoctoral positions available Message-ID: <379CC9A8.356BA77A@ece.ogi.edu> ***************** Post-Doctoral Research Associate **************** OREGON GRADUATE INSTITUTE The Oregon Graduate Institute of Science and Technology (OGI) has an immediate opening for a motivated Ph.D. to participate in an interdisciplinary autonomous controls project. Initial appointment is for a one-year period, with the possibility of extension depending on performance and availability of funding. This is a joint project with the Electrical and Computer Engineering Dept, the Center for Coastal and Land-Margin Research (CCALMR, http://www.ccalmr.ogi.edu) and the Pacific Software Research Center (PacSoft, http://www.cse.ogi.edu/PacSoft/). SOFTWARE ENABLED CONTROL PROJECT OVERVIEW: With the increase in on-board computational resources, optimal nonlinear control can be derived from accurate real-time computational models of a vehicle and its physical environment. This multidisciplinary project couples advances in dynamic vehicle modeling, large-scale environmental forecasting, software engineering, and recent advances in nonlinear control. The objectives are to demonstrate rapid dynamic reconfiguration of control laws and the ability to perform complex maneuvers. Progress to date has focused on optimal neural control approaches for trajectory generation of surface vessels navigating in forecasted flow fields of the Columbia River Estuary. The program will expand to aerospace environments, and may investigate a number of different control paradigms. CANDIDATE QUALIFICATIONS: Candidates should have a Ph.D. with strong expertise in control theory, neural networks and reinforcement learning. ADDITIONAL INFORMATION: The successful candidate will work with Eric A. 
Wan (http://www.ece.ogi.edu/~ericwan/) and his research group and collaborators. Please send inquiries and background information to ericwan at ece.ogi.edu. Eric A. Wan Associate Professor, OGI http://ece.ogi.edu/~ericwan/ ********************************************************************* From mdorigo at ulb.ac.be Mon Jul 26 17:03:42 1999 From: mdorigo at ulb.ac.be (Marco DORIGO) Date: Mon, 26 Jul 1999 23:03:42 +0200 Subject: Final Call for Participation and Abstracts: Fourth European Workshop on Reinforcement Learning Message-ID: Call for Abstracts: EWRL-4, Fourth European Workshop on Reinforcement Learning Lugano, Switzerland, October 29-30, 1999 (We apologize for duplicates of this email) Reinforcement learning (RL) is a growing research area. To build a European RL community and give visibility to the current situation on the old continent, we are running a now biennial series of workshops. EWRL-1 took place in Brussels, Belgium (1994), EWRL-2 in Milano, Italy (1995), EWRL-3 in Rennes, France (1997). EWRL-4 will take place in Lugano, Switzerland (1999). The first morning will feature a plenary talk by Dario Floreano. The rest of the two-day workshop will be dedicated to presentations given by selected participants. Presentation length will be determined once we have some feedback on the number of participants. The number of participants will be limited. Access will be restricted to active RL researchers and their students. Please communicate as soon as possible, and in any case before the end of July 1999, your intention to participate by means of the intention form attached below (e-mail preferred: ewrl at iridia.ulb.ac.be). 
Otherwise send intention forms to: Marco Dorigo IRIDIA, CP 194/6 Universite' Libre de Bruxelles Avenue Franklin Roosevelt 50 1050 Bruxelles Belgium TIMELINE: intention forms and one-page abstracts should be emailed by AUGUST 15, 1999 to ewrl at iridia.ulb.ac.be Up-to-date information, including inscription fees, hotel information, etc., is maintained at: http://iridia.ulb.ac.be/~ewrl/EWRL4/EWRL4.html The Organizing Committee Marco Dorigo and Hugues Bersini, IRIDIA, ULB, Brussels, Belgium, Luca M. Gambardella and Juergen Schmidhuber, IDSIA, Lugano, Switzerland, Marco Wiering, University of Amsterdam, The Netherlands. -------------------------------------------------------------------- INTENTION FORM (to be emailed by AUGUST 15, 1999, to ewrl at iridia.ulb.ac.be) Fourth European Workshop on Reinforcement Learning (EWRL-4) Lugano, Switzerland, October 29-30, 1999 Family Name: First Name: Institution: Address: Phone No.: Fax No.: E-mail: ____ I intend to participate without giving a presentation ____ I intend to participate and would like to give a presentation with the following title: ____ MAX one page abstract: From axon at cortex.rutgers.edu Tue Jul 27 12:42:11 1999 From: axon at cortex.rutgers.edu (Ralph Siegel) Date: Tue, 27 Jul 1999 12:42:11 -0400 Subject: Vision postdoctoral position - Rutgers Message-ID: <001101bed84e$f91b09a0$21b5e6a5@stp.rutgers.edu> Postdoctoral Position in Vision Research. A position for a postdoctoral trainee interested in studying visual perception is available in the Center for Molecular and Behavioral Neuroscience at Rutgers University. The collaboration concerns exploring the computations needed for spatial perception in the parietal lobe of behaving primates using single-unit recordings, intrinsic optical imaging techniques, microdialysis and psychophysical methods. References to recent publications on this work are available on our Web site http://www.cmbn.rutgers.edu/cmbn/faculty/rsiegel.html. 
Interested candidates should contact Ralph Siegel, CMBN, Rutgers University, 197 University Avenue, Newark, NJ 07102, Phone: 973-353-1080 x3261. Fax: 973-353-1272. E-mail: axon at cortex.rutgers.edu From ericwan at ece.ogi.edu Tue Jul 27 14:30:15 1999 From: ericwan at ece.ogi.edu (Eric Wan) Date: Tue, 27 Jul 1999 11:30:15 -0700 Subject: OGI - Ph.D Opening Message-ID: <379DFAB7.433C7A0E@ece.ogi.edu> *********** PH.D. STUDENT RESEARCH POSITION OPENING **************** CENTER FOR SPOKEN LANGUAGE UNDERSTANDING http://cslu.cse.ogi.edu/ OREGON GRADUATE INSTITUTE The Oregon Graduate Institute of Science and Technology (OGI) has an immediate opening for an outstanding student in its Electrical and Computer Engineering Ph.D. program. Full stipend and tuition will be covered. The student will specifically work with Professor Eric A. Wan (http://www.ece.ogi.edu/~ericwan/) on a number of projects relating to neural network learning and speech enhancement. QUALIFICATIONS: The candidate should have a strong background in signal processing with some prior knowledge of neural networks. A Master's degree in Electrical Engineering is preferred. Please send inquiries and background information to ericwan at ece.ogi.edu. Eric A. Wan Associate Professor, OGI ********************************************************************* OGI OGI is a young, but rapidly growing, private research institute located in the Portland area. OGI offers Masters and PhD programs in Computer Science and Engineering, Applied Physics, Electrical Engineering, Biology, Chemistry, Materials Science and Engineering, and Environmental Science and Engineering. OGI has world-renowned research programs in the areas of speech systems (Center for Spoken Language Understanding) and machine learning (Center for Information Technologies). 
Center for Spoken Language Understanding The Center for Spoken Language Understanding is a multidisciplinary academic organization that focuses on basic research in spoken language systems technologies, training of new investigators, and development of tools and resources for free distribution to the research and education community. Areas of specific interest include speech recognition, natural language understanding, text-to-speech synthesis, speech enhancement in noisy conditions, and modeling of human dialogue. A key activity is the ongoing development of the CSLU Toolkit, a comprehensive software platform for learning about, researching, and developing spoken dialog systems and new applications. Center for Information Technologies The Center for Information Technologies supports development of powerful, robust, and reliable information processing techniques by incorporating human strategies and constraints. Such techniques are critical building blocks of multimodal communication systems, decision support systems, and human-machine interfaces. The CIT approach is based on emulating relevant human information processing capabilities and extending them to a variety of complex tasks. The approach requires expertise in nonlinear and adaptive signal processing (e.g., neural networks), statistical computation, decision analysis, and modeling of human information processing. Correspondingly, CIT research areas include perceptual characterization of speech and images, prediction, robust signal processing, rapid adaptation to changing environments, nonlinear signal representation, integration of information from several sources, and integration of prior knowledge with adaptation. 
From ormoneit at stat.Stanford.EDU Tue Jul 27 20:33:00 1999 From: ormoneit at stat.Stanford.EDU (Dirk Ormoneit) Date: Tue, 27 Jul 1999 17:33:00 -0700 (PDT) Subject: New Paper on Neural Networks, Filtering, and Derivatives Pricing Message-ID: <199907280033.RAA17620@rgmiller.Stanford.EDU> Dear Colleagues, Please take note of the following working paper, available online at http://www-stat.stanford.edu/~ormoneit/clearn.ps . Best regards, Dirk ------------------------------------------------------------------ A REGULARIZATION APPROACH TO CONTINUOUS LEARNING WITH AN APPLICATION TO FINANCIAL DERIVATIVES PRICING by Dirk Ormoneit We consider the training of neural networks in cases where the nonlinear relationship of interest gradually changes over time. One way to deal with this problem is regularization, where a variation penalty is added to the usual mean squared error criterion. To learn the regularized network weights we suggest the Iterative Extended Kalman Filter (IEKF) as a learning rule, which may be derived from a Bayesian perspective on the regularization problem. A primary application of our algorithm is in financial derivatives pricing, where neural networks may be used to model the dependency of the derivatives' price on one or several underlying assets. After giving a brief introduction to the problem of derivatives pricing, we present experiments with German stock index options data showing that a regularized neural network trained with the IEKF outperforms several benchmark models and alternative learning procedures. In particular, the performance may be greatly improved by using a newly designed neural network architecture that accounts for no-arbitrage pricing restrictions. 
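For intuition, the variation-penalty idea in the abstract can be sketched with a linear-in-parameters model rather than the paper's neural network and IEKF (a toy illustration only; all function names and constants below are invented): at each time step the new weights minimize the squared error on the latest data batch plus a penalty lam * ||w - w_prev||^2 on how far the weights move, which in the linear case has a closed-form solution.

```python
import numpy as np

def continual_ridge_step(X, y, w_prev, lam):
    """Minimize ||y - X w||^2 + lam * ||w - w_prev||^2 in closed form.

    The variation penalty keeps the new weights close to the previous
    ones, so the fit tracks a slowly drifting relationship smoothly.
    """
    d = X.shape[1]
    A = X.T @ X + lam * np.eye(d)
    b = X.T @ y + lam * w_prev
    return np.linalg.solve(A, b)

# Track a linear relationship whose true weights drift over time.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5])
w = np.zeros(2)
for t in range(50):
    w_true = w_true + 0.01          # gradual change of the target rule
    X = rng.normal(size=(20, 2))
    y = X @ w_true + 0.1 * rng.normal(size=20)
    w = continual_ridge_step(X, y, w, lam=5.0)
```

A large lam makes the fit sluggish but smooth; lam = 0 refits each batch from scratch. The IEKF of the paper plays an analogous role for nonlinear networks, maintaining an evolving estimate of the weights rather than this simple point update.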
-------------------------------------------- Dirk Ormoneit Department of Statistics, Room 206 Stanford University Stanford, CA 94305-4065 ph.: (650) 725-6148 fax: (650) 725-8977 ormoneit at stat.stanford.edu http://www-stat.stanford.edu/~ormoneit/ From Jonathan.Baxter at anu.edu.au Tue Jul 27 22:03:58 1999 From: Jonathan.Baxter at anu.edu.au (Jonathan Baxter) Date: Wed, 28 Jul 1999 12:03:58 +1000 Subject: Paper Available On Reinforcement Learning Message-ID: <9907281209350V.27898@pasiphae.anu.edu.au> The following paper can be obtained from: http://wwwsyseng.anu.edu.au/~jon/papers/drlalg.ps.gz Title: Direct Gradient-Based Reinforcement Learning: I. Gradient Estimation Algorithms Authors: Jonathan Baxter and Peter Bartlett Research School of Information Sciences and Engineering Australian National University Jonathan.Baxter at anu.edu.au, Peter.Bartlett at anu.edu.au Abstract: Despite their many empirical successes, approximate value-function based approaches to reinforcement learning suffer from a paucity of theoretical guarantees on the performance of the policy generated by the value-function. In this paper we pursue an alternative approach: first compute the gradient of the {\em average reward} with respect to the parameters controlling the state transitions in a Markov chain (be they parameters of a class of approximate value functions generating a policy by some form of look-ahead, or parameters directly parameterizing a set of policies), and then use gradient ascent to generate a new set of parameters with increased average reward. We call this method ``direct'' reinforcement learning because we are not attempting to first find an accurate value-function from which to generate a policy; we are instead adjusting the parameters to directly improve the average reward. We present an algorithm for computing approximations to the gradient of the average reward from a single sample path of the underlying Markov chain. 
We show that the accuracy of these approximations depends on the relationship between the discount factor used by the algorithm and the mixing time of the Markov chain, and that the error can be made arbitrarily small by setting the discount factor suitably close to $1$. We extend this algorithm to the case of partially observable Markov decision processes controlled by stochastic policies. We prove that both algorithms converge with probability 1. From ray at hip.atr.co.jp Wed Jul 28 16:39:39 1999 From: ray at hip.atr.co.jp (Thomas S.Ray) Date: Wed, 28 Jul 99 16:39:39 JST Subject: ARTIFICIAL LIFE AND ROBOTICS (AROB 5th '00) Message-ID: <9907280739.AA02152@laplace.hip.atr.co.jp> We apologize if you receive this more than once ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ THE FIFTH INTERNATIONAL SYMPOSIUM ON ARTIFICIAL LIFE AND ROBOTICS (AROB 5th '00) For Human Welfare and Artificial Life Robotics Jan. 26-Jan. 28, 2000, Compal Hall, Oita, JAPAN OBJECTIVE The objective is to start collaborative research on the development of the theories of artificial life and complexity and their applications to robotics by forming several research groups whose members are experts in their fields. Research on artificial life and complexity began recently in Japan and is expected to be applied in various fields. In particular, the research groups will focus their attention on the development of the hardware of artificial brains using neurocomputers etc. The research groups will also develop fundamental research on applications of artificial life and complexity to robotics through close relationships between researchers. 
TOPICS Artificial Brain Research Artificial Intelligence Artificial Life Artificial Living Artificial Mind Research Brain Science Chaos Cognitive Science Complexity Computer Graphics Evolutionary Computations Fuzzy Control Genetic Algorithms Innovative Computations Intelligent Control and Modeling Micromachines Micro-Robot World Cup Soccer Tournament Mobile Vehicles Molecular Biology Neural Networks Neurocomputers Neurocomputing Technologies and its Application for Hardware Robotics Robust Virtual Engineering Virtual Reality DEADLINES September 15, 1999 Submission of a 400-600 word abstract of a paper in English (3 copies). October 15, 1999 Notification of acceptance of papers. November 15, 1999 Submission of the text of the paper (camera-ready manuscript). CONFERENCE LANGUAGE English HOMEPAGE http://arob.cc.oita-u.ac.jp/ PUBLICATION Accepted papers will be published in the Proceedings of AROB, and authors of some high-quality papers in the Proceedings will be invited to resubmit them for consideration for publication in a new international journal, ARTIFICIAL LIFE AND ROBOTICS (Springer), and in APPLIED MATHEMATICS AND COMPUTATION (North-Holland). Send to: AROB Secretariat c/o Sugisaka Laboratory Dept of Electrical and Electronic Engineering Oita University 700 Dannoharu, Oita 879-1192, JAPAN TEL +81-97-554-7841, FAX +81-97-554-7818 E-MAIL arobsecr at cc.oita-u.ac.jp HISTORY This symposium was founded in 1996 with the support of the Science and International Affairs Bureau, Ministry of Education, Science, Sports, and Culture, Japanese Government. It was held in 1996 (First), 1997 (Second), 1998 (Third), and 1999 (Fourth) at B-Con Plaza, Beppu, Oita, Japan. The Fifth symposium will be held January 26-28, 2000 at Compal Hall, Oita, Japan. This symposium invites you all to discuss the development of new technologies concerning Artificial Life and Robotics based on simulation and hardware in the twenty-first century. 
ORGANIZED BY Organizing Committee of International Symposium on Artificial Life and Robotics under the Sponsorship of Science and International Affairs Bureau, Ministry of Education, Science, Sports, and Culture, Japanese Government CO-SPONSORED BY Santa Fe Institute (SFI, USA) The Society of Instrument and Control Engineers (SICE, JAPAN) The Robotics Society of Japan (RSJ, JAPAN) The Institute of Electrical Engineers of Japan (IEEJ, JAPAN) COOPERATED BY ISICE IEICE IEEE Tokyo Section JARA SUPPORTED BY OITA Prefecture Government, etc. HONORARY PRESIDENT M.Hiramatsu (Governor, Oita Prefecture, Japan) ADVISORY COMMITTEE CHAIRMAN M.Ito (Director, RIKEN, Japan) GENERAL CHAIRMAN M.Sugisaka (Oita University, Japan) VICE CHAIRMAN J.L.Casti (Santa Fe Institute, USA) PROGRAM CHAIRMAN H.Tanaka (Tokyo Medical & Dental University, Japan) ADVISORY COMMITTEE M.Ito (Director, RIKEN, Japan) (Chairman) H.Kimura (The University of Tokyo, Japan) S.Fujimura (The University of Tokyo, Japan) S.Ueno (Kyoto University, Japan) INTERNATIONAL ORGANIZING COMMITTEE W.B.Arthur (Santa Fe Institute, USA) W.Banzhaf (University of Dortmund, Germany) C.Barrett (Los Alamos National Laboratory, USA) J.P.Crutchfield (Santa Fe Institute, USA) J.L.Casti (Santa Fe Institute, USA) (Vice Chairman) J.M.Epstein (Santa Fe Institute, USA) T.Fukuda (Nagoya University, Japan) D.J.G.James (Coventry University, UK) S.Kauffman (Santa Fe Institute, USA) C.G.Langton (Santa Fe Institute, USA) R.G.Palmer (Santa Fe Institute, USA) S.Rasmussen (Santa Fe Institute, USA) T.S.Ray (Santa Fe Institute, USA) P.Schuster (Santa Fe Institute, USA) M.Sugisaka (Oita University, Japan) (Chairman) H.Tanaka (Tokyo Medical & Dental University, Japan) C.Taylor (University of California-Los Angeles, USA) W.R.Wells (University of Nevada-Las Vegas, USA) Y.G.Zhang (Academia Sinica, China) INTERNATIONAL STEERING COMMITTEE Z.Bubunicki (Wroclaw University of Technology, Poland) J.L.Casti (Santa Fe Institute, USA) (Co-Chairman) S.Fujimura (The 
University of Tokyo) D.J.G.James (Coventry University, UK) J.J.Lee (KAIST, Korea) G.Matsumoto (RIKEN, Japan) M.Nakamura (Saga University, Japan) S.Rasmussen (Santa Fe Institute, USA) T.S.Ray (Santa Fe Institute, USA) M.Sugisaka (Oita University, Japan) (Chairman) H.Tanaka (Tokyo Medical & Dental University, Japan) C.Taylor (University of California-Los Angeles, USA) W.R.Wells (University of Nevada-Las Vegas, USA) Y.G.Zhang (Academia Sinica, China) INTERNATIONAL PROGRAM COMMITTEE K.Abe (Tohoku University, Japan) K.Aihara (The University of Tokyo, Japan) (Co-Chairman) Z.Bubunicki (Wroclaw University of Technology, Poland) T.Christaller (GMD-German National Research Center for Information Technology, Germany) T.Fujii (RIKEN, Japan) M.Gen (Ashikaga Institute of Technology, Japan) T.Gomi (AAI, Canada) I.Harvey (University of Sussex, UK) H.Hashimoto (The University of Tokyo, Japan) (Co-Chairman) P.Husbands (University of Sussex, UK) J.Johnson (The Open University, UK) Y.Kakazu (Hokkaido University, Japan) R.E.Kalaba (University of Southern California, USA) H.Kashiwagi (Kumamoto University, Japan) O.Katai (Kyoto University, JAPAN) S.Kawata (Tokyo Metropolitan University, Japan) J.H.Kim (KAIST, Korea) S.Kitamura (Kobe University, Japan) H.Kitano (Sony Computer Science Laboratory Inc., Japan) T.Kitazoe (Miyazaki University, Japan) S.Kumagai (Osaka University, Japan) H.H.Lund (University of Aarhus, Denmark) M.Nakamura (Saga University, Japan) R.Nakatsu (ATR, Japan) H.H.Natsuyama (Advanced Industrial Materials, USA) T.Omori (Tokyo University of Agriculture & Technology, Japan) R.Pfeifer (University of Zurich-Irchel, Switzerland) T.S.Ray (Santa Fe Institute, USA) (Co-Chairman) T.Sawaragi (Kyoto University, Japan) T.Shibata (MITI, MEL, Japan) K.Shimohara (ATR, Japan) L.Steels (VUB AI Laboratory, Belgium) M.Sugisaka (Oita University, Japan) S.Tamura (Osaka University, Japan) H.Tanaka (Tokyo Medical & Dental University, Japan) (Chairman) N.Tosa (ATR, Japan) K.Ueda (Kobe 
University, Japan) A.P.Wang (Arizona State University, USA) K.Watanabe (Saga University, Japan) X.Yao (The University of New South Wales, Australia) LOCAL ARRANGEMENT COMMITTEE T.Hano (Oita University, Japan) T.Ito (Oita University, Japan) E.Kusayanagi (Oita University, Japan) T.Matsuo (Oita University, Japan) Y.Morita (Oita University, Japan) K.Nakano (University of Electro-Communications, Japan) K.Okazaki (Fukui University, Japan) S.Sato (Director, Research and Development Center, Oita University, Japan) K.Shigemitsu (Oita Industrial Research Institute, Japan) M.Sugisaka (Oita University, Japan) H.Tsukune (Director, Oita Industrial Research Institute, Japan) K.Yoshida (Oita University, Japan) X.Wang (Oita Institute of Technology, Japan) Y.Suzuki (Tokyo Medical & Dental University) From biehl at physik.uni-wuerzburg.de Wed Jul 28 06:31:36 1999 From: biehl at physik.uni-wuerzburg.de (Michael Biehl) Date: Wed, 28 Jul 1999 12:31:36 +0200 (METDST) Subject: preprint: noisy regression and classification Message-ID: <199907281031.MAA00326@wptx38.physik.uni-wuerzburg.de> FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/1999/WUE-ITP-99-015.ps.gz The following manuscript is now available via anonymous ftp, see below for the retrieval procedure. More conveniently, it can be obtained from the Wuerzburg Theoretical Physics preprint server in the WWW: http://theorie.physik.uni-wuerzburg.de/~publications.shtml or from M. Biehl's personal homepage http://theorie.physik.uni-wuerzburg.de/~biehl ------------------------------------------------------------- Ref. WUE-ITP-99-015 Noisy regression and classification with continuous multilayer networks M. Ahr, M. Biehl, and R. Urbanczik ABSTRACT We investigate zero temperature Gibbs learning for two classes of unrealizable rules which play an important role in practical applications of multilayer neural networks with differentiable activation functions: classification problems and noisy regression problems. 
Considering one step of replica symmetry breaking, we surprisingly find that for sufficiently large training sets the stable state is replica symmetric even though the target rule is unrealizable. The common practice of approximating a classification scheme by a continuous regression problem is demonstrated to drastically increase the number of examples needed for successful training. ------------------------------------------------------------------- Retrieval procedure via anonymous ftp: unix> ftp ftp.physik.uni-wuerzburg.de Name: anonymous Password: {your e-mail address} ftp> cd pub/preprint/1999 ftp> binary ftp> get WUE-ITP-99-015.ps.gz (*) ftp> quit unix> gunzip WUE-ITP-99-015.ps.gz e.g. unix> lp -odouble WUE-ITP-99-015.ps (*) can be replaced by "get WUE-ITP-99-015.ps". The file will then be uncompressed before transmission (slow!). ___________________________________________________________________ -- Michael Biehl Institut fuer Theoretische Physik Julius-Maximilians-Universitaet Wuerzburg Am Hubland D-97074 Wuerzburg email: biehl at physik.uni-wuerzburg.de www: http://theorie.physik.uni-wuerzburg.de/~biehl Tel.: (+49) (0)931 888 5865 " " " 5131 Fax : (+49) (0)931 888 5141 From wolfskil at MIT.EDU Thu Jul 29 14:53:46 1999 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Thu, 29 Jul 1999 14:53:46 -0400 Subject: book announcement Message-ID: A non-text attachment was scrubbed... 
Name: not available Type: text/enriched Size: 1994 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/d34ba71b/attachment-0001.bin From M.Orr at anc.ed.ac.uk Thu Jul 29 09:31:10 1999 From: M.Orr at anc.ed.ac.uk (Mark Orr) Date: Thu, 29 Jul 1999 14:31:10 +0100 (BST) Subject: Matlab RBF package Message-ID: <199907291331.OAA21043@bimbo.cns.ed.ac.uk> Dear Connectionists, An updated version of my Matlab software package for RBF networks is available at http://www.anc.ed.ac.uk/~mjo/rbf.html The methods based on ridge regression and forward selection have been improved with the addition of a mechanism for adapting the basis function widths, and there are two new methods based on Kubat's idea of using regression trees to set the centres and widths. Mark Orr mark at anc.ed.ac.uk