From krichmar at nsi.edu Fri Aug 1 17:48:45 2003 From: krichmar at nsi.edu (Jeff Krichmar) Date: Fri, 1 Aug 2003 14:48:45 -0700 Subject: Postdoctoral Position in Machine Psychology and Brain-Based Devices Message-ID: <000301c35876$b028b970$6fb985c6@DHYSPR11> Please post and circulate as you see fit. Thank you. POSTDOCTORAL FELLOWSHIP The Neurosciences Institute, located in San Diego, California, invites applications for a POSTDOCTORAL FELLOWSHIP to study biologically based models of the nervous system using behaving brain-based devices or robots. To extend previous research conducted at the Institute, the Fellow will focus on the construction of autonomous brain-based devices, on the design of simulated models of large-scale neuronal networks that are capable of guiding behavior in the real world, and on developing methods for the simultaneous analysis of neural and behavioral states. Applicants should have a background in one or more of the following disciplines: computational neuroscience, robotics, computer science, behavioral science, or cognitive science. Fellows will receive stipends appropriate to their qualifications and experience. Submit a curriculum vitae, statement of research interests, and names of three references to: Dr. Jeffrey L. Krichmar The Neurosciences Institute 10640 John Jay Hopkins Drive San Diego, California 92121 Email: krichmar at nsi.edu Fax: 858-626-2099 For a description of the project, refer to http://www.nsi.edu/nomad/ or Krichmar and Edelman, (2002) "Machine Psychology: Autonomous Behavior, Perceptual Categorization and Conditioning in a Brain-Based Device", Cerebral Cortex 12:818-830, http://www.nsi.edu/nomad/jlk_gme_cereb_cortex_2002.pdf. For a description of The Neurosciences Institute, refer to http://www.nsi.edu. 
From a.van.ooyen at nih.knaw.nl Tue Aug 5 07:39:33 2003 From: a.van.ooyen at nih.knaw.nl (Arjen van Ooyen) Date: Tue, 05 Aug 2003 13:39:33 +0200 Subject: New Book: Modeling Neural Development Message-ID: <3F2F9775.5060902@nih.knaw.nl> Modeling Neural Development Edited by Arjen van Ooyen The MIT Press, Cambridge, Massachusetts, 2003 This is one of the first books to study neural development using computational and mathematical modeling. Most neural modeling focuses on information processing in the adult nervous system; Modeling Neural Development shows how models can be used to study the development of the nervous system at different levels of organization and at different phases of development, from molecule to system and from neurulation to cognition. The book's fourteen chapters follow loosely the chronology of neural development. Chapters 1 and 2 study the very early development of the nervous system, discussing gene networks, cell differentiation, and neural tube development. Chapters 3-5 examine neuronal morphogenesis and neurite outgrowth. Chapters 6-8 study different aspects of the self-organization of neurons into networks. Chapters 9-12 cover refinement of connectivity and the development of specific connectivity patterns. Chapters 13 and 14 focus on some of the functional implications of morphology and development. For more information, go to http://www.anc.ed.ac.uk/~arjen/papers/ModelingNeuralDevelopment.html -- Dr. 
Arjen van Ooyen
Netherlands Institute for Brain Research
Meibergdreef 33, 1105 AZ Amsterdam, The Netherlands
Email: A.van.Ooyen at nih.knaw.nl
Website: http://www.anc.ed.ac.uk/~arjen
Phone: +31.20.5665483
Fax: +31.20.6961006

From shantanu at jhu.edu Tue Aug 5 20:57:43 2003
From: shantanu at jhu.edu (SHANTANU CHAKRABARTTY)
Date: Tue, 05 Aug 2003 20:57:43 -0400
Subject: GiniSVM Toolkit v1.2 Available For Download
Message-ID:

The latest version 1.2 of the GiniSVM toolkit can be downloaded from http://bach.ece.jhu.edu/svm/ginisvm

* The toolkit has additional heuristics to speed up training when there is a large number of classes.
* A Dynamic Time Warping kernel has been added for simple transducer-based experiments.

GiniSVM is a multi-class probabilistic regression machine, based on support vector machines, that generates conditional probability estimates as a solution to its training. GiniSVM probabilities can be used directly, in approximate form, in a logistic model or in other higher-end models.

-------------------------------------------
Shantanu Chakrabartty
Center for Language and Speech Processing
The Johns Hopkins University, Baltimore, MD - 21218, USA
email: shantanu at jhu.edu
Phone: 1-410-516-7701
Fax: 1-410-516-5566
--------------------------------------------

From terry at salk.edu Tue Aug 5 15:40:49 2003
From: terry at salk.edu (Terry Sejnowski)
Date: Tue, 5 Aug 2003 12:40:49 -0700 (PDT)
Subject: NEURAL COMPUTATION 15:9
Message-ID: <200308051940.h75JenY17542@purkinje.salk.edu>

Neural Computation - Contents - Volume 15, Number 9 - September 1, 2003

VIEW

Have Brain Dynamics Evolved? Should We Look for Unique Dynamics in the Sapient Species?
Theodore Holmes Bullock

LETTERS

Is There Something Out There? Inferring Space from Sensorimotor Dependencies
D. Philipona, J. K. O'Regan, and J.-P.
Nadal

A Developmental Approach Aids Motor Learning
Volodymyr Ivanchenko and Robert Jacobs

Cell Responsiveness in Macaque Superior Temporal Polysensory Area Measured by Temporal Discriminants
J. A. Turner, K. C. Anderson, and R. M. Siegel

Local Interactions in Neural Networks Explain Global Effects in Gestalt Processing and Masking
Michael H. Herzog, Udo A. Ernst, Axel Etzold, and Christian W. Eurich

Computing with Populations of Monotonically Tuned Neurons
Emmanuel Guigon

A Simple and Stable Numerical Solution for the Population Density Equation
M. de Kamps

Slow Feature Analysis: A Theoretical Analysis of Optimal Free Responses
Laurenz Wiskott

Synchrony of Fast-Spiking Interneurons Interconnected by GABAergic and Electrical Synapses
Masaki Nomura, Tomoki Fukai, and Toshio Aoyagi

Activation Functions Defined on Higher Dimensional Spaces for Approximation on Compact Sets With and Without Scaling
Yoshifusa Ito

Bayesian Trigonometric Support Vector Classifier
Wei Chu, S. Sathiya Keerthi, and Chong Jin Ong

-----

ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES

                 USA     Canada*    Other Countries
Student/Retired  $60     $64.20     $108
Individual       $95     $101.65    $143
Institution      $590    $631.30    $638
* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889
FAX: (617) 577-1545
journals-orders at mit.edu

-----

From cindy at cns.bu.edu Tue Aug 5 13:56:14 2003
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Tue, 5 Aug 2003 13:56:14 -0400
Subject: Neural Networks 16(7)
Message-ID: <035701c35b7a$dd84c6f0$573dc580@bu.edu>

NEURAL NETWORKS 16(7)
Contents - Volume 16, Number 7 - 2003
------------------------------------------------------------------

*** NEURAL NETWORKS LETTERS ***

Two-level hierarchy with sparsely and temporally coded patterns and its possible functional role in information processing
Masaki Nomura, Toshio Aoyagi, and Masato Okada

*** NEUROSCIENCE AND NEUROPSYCHOLOGY ***

Incremental training of first order recurrent neural networks to predict a context-sensitive language
Stephan K. Chalup and Alan D. Blair

A model of dopamine modulated cortical activation
F. Gregory Ashby and Michael B. Casale

*** MATHEMATICAL AND COMPUTATIONAL ANALYSIS ***

Inter-module credit assignment in modular reinforcement learning
Kazuyuki Samejima, Kenji Doya, and Mitsuo Kawato

Bounds on the number of hidden neurons in three-layer binary neural networks
Zhaozhi Zhang, Xiaomin Ma, and Yixian Yang

A new algorithm for online structure and parameter adaptation of RBF networks
Alex Alexandridis, Haralambos Sarimveis, and George Bafas

Relaxed conditions for radial-basis function networks to be universal approximators
Yi Liao, Shu-Cherng Fang, and Henry L.W. Nuttle

Singularities in mixture models and upper bounds of stochastic complexity
Keisuke Yamazaki and Sumio Watanabe

Study of distributed learning as a solution to category proliferation in Fuzzy ARTMAP based neural systems
Emilio Parrado-Hernandez, Eduardo Gomez-Sanchez, and Yannis A.
Dimitriadis *** TECHNOLOGY AND APPLICATIONS *** Converting general nonlinear programming problems into separable programming problems with feedforward neural networks Bao-Liang Lu and Koji Ito ARTMAP neural networks for information fusion and data mining: Map production and target recognition methodologies Olga Parsons and Gail A. Carpenter *** CURRENT EVENTS *** ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
--------------------------------------------------------------------------
Membership Type                 INNS          ENNS        JNNS
--------------------------------------------------------------------------
membership with
Neural Networks
  regular                       $80           SEK 660     Y 13,000 (plus
                                                          Y 2,000
                                                          enrollment fee)
  student                       $20           SEK 460     Y 11,000 (plus
                                                          Y 2,000
                                                          enrollment fee)
--------------------------------------------------------------------------
membership without              $30           SEK 200     not available to
Neural Networks                                           non-students;
(subscribe through                                        Y 5,000 student
another society)                                          (plus Y 2,000
                                                          enrollment fee)
--------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408
531 28 Skovde, Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.his.se/ida/enns

JNNS Membership
c/o Professor Shozo Yasui
Kyushu Institute of Technology
Graduate School of Life Science and Engineering
2-4 Hibikino, Wakamatsu-ku
Kitakyushu 808-0196 Japan
81 93 695 6108 (phone and fax)
jnns at brain.kyutech.ac.jp
http://www.jnns.org/

-----------------------------------------------------------------

From Domenico.Perrotta at cec.eu.int Wed Aug 6 05:51:16 2003
From: Domenico.Perrotta at cec.eu.int (Domenico.Perrotta@cec.eu.int)
Date: Wed, 6 Aug 2003 11:51:16 +0200
Subject: Call for Cognitive Systems proposals
Message-ID:

On 17 June 2003 the European Commission launched a call for proposals that addresses, among other research areas, Cognitive Systems. The deadline for those wishing to apply is 15 October 2003. The indicative pre-allocated budget is 25 MEuro. Links to the official texts and the rules for applying can be found on the web site http://fp6.cordis.lu/fp6/call_details.cfm?CALL_ID=74. You can register your intention to submit a proposal at http://www.cordis.lu/fp6/pre_registration.htm

IST-2002-2.3.2.4 - Cognitive systems

Objective: To construct physically instantiated or embodied systems that can perceive, understand (the semantics of information conveyed through their perceptual input) and interact with their environment, and evolve in order to achieve human-like performance in activities requiring context- (situation and task) specific knowledge.

Focus is on:
- Methodologies and construction of robust and adaptive cognitive systems integrating perception, reasoning, representation and learning, that are capable of interpretation, physical interaction and communication in real-world environments for the purpose of performing goal-directed tasks. Research will aim at realising complete systems with real-time performance and/or bounded rationality, that have well-developed memory capacities (e.g.
short term, long term, iconic, associative) with efficient representation, and that acquire representations as needed to realise performance goals. The emphasis is on closing the loop in realistic test cases. A main target of this research is interdisciplinarity, i.e., to carefully consider the integration of different disciplines including computer vision, natural language understanding, robotics, artificial intelligence, mathematics and cognitive neuroscience and its impact on overall system design. Integrated Projects are expected to leverage these communities to integrate methods and insights towards the objective of realising entire systems and to promote community building. NoEs will provide a channel for fostering foundational research, for developing and maintaining common resources, specifically, of open systems and training environments to study learning and evolving systems. For specific information on the call, please write to domenico.perrotta at cec.eu.int tel: +352 4301 38257 Administrative contact person: Adriana Bini adriana.bini at cec.eu.int Tel: +352 4301 33528 From bio-adit2004-NOSPAM at listes.epfl.ch Fri Aug 8 02:03:56 2003 From: bio-adit2004-NOSPAM at listes.epfl.ch (Bio-ADIT2004) Date: Fri, 8 Aug 2003 08:03:56 +0200 Subject: [Bio-ADIT2004] - Second Call for Papers Message-ID: <18734A83-C966-11D7-9017-000A95945D40@listes.epfl.ch> ================================================================ We apologize if you receive multiple copies of this email. Please distribute this announcement to all interested parties. 
================================================================

Bio-ADIT 2004

SECOND CALL FOR PAPERS

The First International Workshop on Biologically Inspired Approaches to Advanced Information Technology

January 29 - 30, 2004
Swiss Federal Institute of Technology, Lausanne, Switzerland
Website: http://lslwww.epfl.ch/bio-adit2004/

Sponsored by
- Osaka University Forum,
- Swiss Federal Institute of Technology, Lausanne, and
- The 21st Century Center of Excellence Program of The Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan, under the Program Title "Opening Up New Information Technologies for Building Networked Symbiosis Environment".

Biologically inspired approaches have already proved successful in achieving major breakthroughs in a wide variety of problems in information technology (IT). A more recent trend is to explore the applicability of bio-inspired approaches to the development of self-organizing, evolving, adaptive and autonomous information technologies, which will meet the requirements of next-generation information systems, such as diversity, scalability, robustness, and resilience. These new technologies will become a base on which to build a networked symbiotic environment for a pleasant, symbiotic society of human beings in the 21st century.

Bio-ADIT 2004 will be the first international workshop to present original research results in the field of bio-inspired approaches to advanced information technologies. It will also serve to foster the connection between biological paradigms and solutions for building next-generation information systems.

SCOPE: The primary focus of the workshop is on new and original research results in the areas of information systems inspired by biology. We invite you to submit papers that present novel, challenging, and innovative results. The topics include all aspects of bio-inspired information technologies in networks, distributed/parallel systems, hardware (including robotics) and software.
We also encourage you to submit papers dealing with: - Self-organizing, self-repairing, self-replicating and self-stabilizing systems - Evolving and adapting systems - Autonomous and evolutionary software and robotic systems - Scalable, robust and resilient systems - Complex biosystems - Gene, protein and metabolic networks - Symbiosis networks SUBMISSION OF PAPERS: Authors are invited to submit complete and original papers. Papers submitted should not have been previously published in any forum, nor be under review for any journal or other conference. All submitted papers will be refereed for quality, correctness, originality and relevance. All accepted papers will be published in the conference proceedings. It is also planned to publish accepted papers as a book. Manuscripts should include an abstract and be limited to 16 pages in single spaced and single column format. Submissions should include the title, author(s), author's affiliation, e-mail address, fax number and postal address. In the case of multiple authors, an indication of which author is responsible for correspondence and preparing the camera ready paper for the proceedings should also be included. Electronic submission is strongly encouraged. Preferred file formats are PDF (.pdf) or Postscript (.ps). Visit our website at http://lslwww.epfl.ch/bio-adit2004/ for more information. Please contact Dr. Murata if you have to submit hard copies. Manuscripts should be submitted by September 5, 2003 through the Bio-ADIT website. 
Please contact the technical program co-chairs for any questions: Professor Auke Jan Ijspeert School of Computer and Communication Sciences Swiss Federal Institute of Technology (EPFL) Lausanne CH 1015 Lausanne, Switzerland Tel: +41-21-693-2658 Fax: +41-21-693-3705 Email: Auke.Ijspeert at epfl.ch Professor Masayuki Murata Cybermedia Center Osaka University Toyonaka, Osaka 560-0043, Japan Tel: +81-6-6850-6860 Fax: +81-6-6850-6868 E-mail: murata at cmc.osaka-u.ac.jp IMPORTANT DATES: Paper submission deadline : September 5, 2003 Notification of acceptance: November 3, 2003 Camera ready papers due : December 1, 2003 STUDENT TRAVEL GRANTS: A limited number of travel grants will be provided for students attending Bio-ADIT 2004. Details of how to apply for a student travel grant will be posted on the workshop website. WEBSITE: An electronic paper submission system is up and ready from July 1, 2003 to accept papers for Bio-ADIT. Please visit our website at http://lslwww.epfl.ch/bio-adit2004/ for more up-to-date information. 
EXECUTIVE COMMITTEE: General Co-Chairs: - Daniel Mange (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Shojiro Nishio (Osaka University, Japan) Technical Program Committee Co-Chairs: - Auke Jan Ijspeert (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Masayuki Murata (Osaka University, Japan) Finance Chair: - Marlyse Taric (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Toshimitsu Masuzawa (Osaka University, Japan) Publicity Chair: - Christof Teuscher (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Takao Onoye (Osaka University, Japan) Publications Chair: - Naoki Wakamiya (Osaka University, Japan) Local Arrangements Chair: - Carlos Andres Pena-Reyes (Swiss Federal Institute of Technology, Lausanne, Switzerland) Internet Chair: - Jonas Buchli (Swiss Federal Institute of Technology, Lausanne, Switzerland) TECHNICAL PROGRAM COMMITTEE: Co-Chairs: - Auke Jan Ijspeert (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Masayuki Murata (Osaka University, Japan) Members: - Michael A. Arbib (University of Southern California, Los Angeles, USA) - Aude Billard (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Takeshi Fukuda (IBM Tokyo Research Laboratory, Japan) - Katsuo Inoue (Osaka University, Japan) - Wolfgang Maass (Graz University of Technology, Austria) - Ian W. 
Marshall (BTexact Technologies, UK) - Toshimitsu Masuzawa (Osaka University, Japan) - Alberto Montresor (University of Bologna, Italy) - Stefano Nolfi (Institute of Cognitive Sciences and Technology,CNR, Rome, Italy) - Takao Onoye (Osaka University, Japan) - Rolf Pfeifer (University of Zurich, Switzerland) - Eduardo Sanchez (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Hiroshi Shimizu (Osaka University, Japan) - Moshe Sipper (Ben-Gurion University, Israel) - Gregory Stephanopoulos (Massachusetts Institute of Technology, USA) - Adrian Stoica (Jet Propulsion Laboratory, USA) - Gianluca Tempesti (Swiss Federal Institute of Technology, Lausanne, Switzerland) - Naoki Wakamiya (Osaka University, Japan) - Hans V. Westerhoff (Vrije Universiteit Amsterdam, NL) - Xin Yao (University of Birmingham, UK) From steve_kemp at unc.edu Sat Aug 16 22:08:22 2003 From: steve_kemp at unc.edu (Steven M. Kemp) Date: Sat, 16 Aug 2003 22:08:22 -0400 Subject: Announcing InSitu WebSite Message-ID: *** PLEASE FORWARD THIS TO ANY INTERESTED PERSONS OR LISTS. THANKS. *** *** (with apologies for duplicate postings) *** Dear Connectionists: The InSitu Testing Group is delighted to announce our new WebSite: http://www.InSituTestbed.org The InSitu Testing Group is an open-source community dedicated to the evaluation of computational theories of psychology and behavior against empirical data. The computational theories we hope to test include neural networks, computational learning theories, reinforcement learning theories, and cognitive theories of all kinds. Key to these evaluations is the distribution of the InSitu testbed, a software system that allows the user to test the algorithm for any given computational theory against a specific task, and generate datasets that can be compared to datasets from experiments or field observations. The InSitu Introductory Guide describes the InSitu testbed and the opportunities to work with the group. 
It is geared towards a diverse, general audience and should provide a good picture of the project to most readers, regardless of background. A pdf version of the Guide is now available and can be downloaded at: http://www.insitutestbed.org/downloads.html#GeneralDocumentation

The Group hopes to have Beta versions of the testbed available to users by year's end. Please check out the Introductory Guide and the WebSite and contact us if you are interested in any of our work. We are also interested in exchanging links with any related WebSites.

Thanks,

steve kemp

--
>>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<<
Steven M. Kemp
Department of Psychology | email: steve_kemp at unc.edu
Davie Hall, CB# 3270
University of North Carolina
Chapel Hill, NC 27599-3270 | fax: (919) 962-2537
Visit our WebSite at: http://www.unc.edu/~skemp/
>>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<<

The laws of mind [are] themselves of so fluid a character as to simulate divergences from law. -- C. S. Peirce (Collected Papers, 6.101).

From mimura at kobe-kosen.ac.jp Sun Aug 17 23:57:20 2003
From: mimura at kobe-kosen.ac.jp (Kazushi Mimura)
Date: Mon, 18 Aug 2003 12:57:20 +0900
Subject: paper available: synapse efficiency diverges due to pruning
Message-ID: <200308180357.AA00220@GOLD01.kobe-kosen.ac.jp>

Apologies for multiple postings.

Dear colleagues,

I would like to announce the following paper, available on the web site http://www.kobe-kosen.ac.jp/~mimura/paper/pre2003.pdf (or cond-mat/0207545, http://arxiv.org/abs/cond-mat/0207545):

'Synapse efficiency diverges due to synaptic pruning following over-growth.'
by Mimura, K., Kimoto, T. & Okada, M.
Physical Review E (in press).

Abstract--------------------------------------------------------
In the development of the brain, it is known that synapses are pruned following over-growth.
This pruning following over-growth seems to be a universal phenomenon that occurs in almost all areas -- visual cortex, motor area, association area, and so on. It has been shown numerically that synapse efficiency is increased by systematic deletion. We discuss the synapse efficiency to evaluate the effect of pruning following over-growth, and analytically show that the synapse efficiency diverges as O(log c) in the limit where the connecting rate c is extremely small. Under a fixed synapse number criterion, the optimal connecting rate, which maximizes memory performance, exists.
----------------------------------------------------------------

Sincerely yours,
Kazushi Mimura

Dept. of Electrical Engineering, Kobe City College of Technology
phone: +81-78-795-3236 (direct line)
fax: +81-78-795-3314
e-mail: mimura at kobe-kosen.ac.jp
url: http://www.kobe-kosen.ac.jp/~mimura/

From beal at cs.toronto.edu Mon Aug 18 12:52:05 2003
From: beal at cs.toronto.edu (Matthew Beal)
Date: Mon, 18 Aug 2003 12:52:05 -0400
Subject: PhD thesis available on Variational Bayes
Message-ID:

Dear Connectionists

I would like to announce my thesis, some companion Matlab software, and a website dedicated to Variational Bayesian techniques.

o My thesis "Variational Methods for Approximate Bayesian Inference" is available from http://www.cs.toronto.edu/~beal/papers.html

o Software for VB Mixtures of Factor Analysers, VB Hidden Markov Models, and VB State Space Models (Linear Dynamical Systems) is available from http://www.cs.toronto.edu/~beal/software.html

o Variational-Bayes.org: a repository of papers, software, and links related to the use of variational methods for approximate Bayesian learning: http://www.variational-bayes.org
We welcome your feedback to help build this site.

Below is an abstract and short contents of my thesis.
Cheers -Matt ---------- Abstract: The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents a unified variational Bayesian (VB) framework which approximates these computations in models with latent variables using a lower bound on the marginal likelihood. Chapter 1 presents background material on Bayesian inference, graphical models, and propagation algorithms. Chapter 2 forms the theoretical core of the thesis, generalising the expectation-maximisation (EM) algorithm for learning maximum likelihood parameters to the VB EM algorithm which integrates over model parameters. The algorithm is then specialised to the large family of conjugate-exponential (CE) graphical models, and several theorems are presented to pave the road for automated VB derivation procedures in both directed and undirected graphs (Bayesian and Markov networks, respectively). Chapters 3-5 derive and apply the VB EM algorithm to three commonly-used and important models: mixtures of factor analysers, linear dynamical systems, and hidden Markov models. It is shown how model selection tasks such as determining the dimensionality, cardinality, or number of variables are possible using VB approximations. Also explored are methods for combining sampling procedures with variational approximations, to estimate the tightness of VB bounds and to obtain more effective sampling algorithms. Chapter 6 applies VB learning to a long-standing problem of scoring discrete-variable directed acyclic graphs, and compares the performance to annealed importance sampling amongst other methods. Throughout, the VB approximation is compared to other methods including sampling, Cheeseman-Stutz, and asymptotic approximations such as BIC. 
The thesis concludes with a discussion of evolving directions for model selection including infinite models and alternative approximations to the marginal likelihood. ---------- Table of Contents: Chapter 1 Introduction 1.1 Probabilistic inference 1.2 Bayesian model selection 1.3 Practical Bayesian approaches 1.4 Summary of the remaining chapters Chapter 2 Variational Bayesian Theory 2.1 Introduction 2.2 Variational methods for ML / MAP learning 2.3 Variational methods for Bayesian learning 2.4 Conjugate-Exponential models 2.5 Directed and undirected graphs 2.6 Comparisons of VB to other criteria 2.7 Summary Chapter 3 Variational Bayesian Hidden Markov Models 3.1 Introduction 3.2 Inference and learning for maximum likelihood HMMs 3.3 Bayesian HMMs 3.4 Variational Bayesian formulation 3.5 Experiments 3.6 Discussion Chapter 4 Variational Bayesian Mixtures of Factor Analysers 4.1 Introduction 4.2 Bayesian Mixture of Factor Analysers 4.3 Model exploration: birth and death 4.4 Handling the predictive density 4.5 Synthetic experiments 4.6 Digit experiments 4.7 Combining VB approximations with Monte Carlo 4.8 Summary Chapter 5 Variational Bayesian Linear Dynamical Systems 5.1 Introduction 5.2 The Linear Dynamical System model 5.3 The variational treatment 5.4 Synthetic Experiments 5.5 Elucidating gene expression mechanisms 5.6 Possible extensions and future research 5.7 Summary Chapter 6 Learning the structure of discrete-variable graphical models with hidden variables 6.1 Introduction 6.2 Calculating marginal likelihoods of DAGs 6.3 Estimating the marginal likelihood 6.4 Experiments 6.5 Open questions and directions 6.6 Summary Chapter 7 Conclusion 7.1 Discussion 7.2 Summary of contributions Appendix A Conjugate Exponential family examples Appendix B Useful results from matrix theory B.1 Schur complements and inverting partitioned matrices B.2 The matrix inversion lemma Appendix C Miscellaneous results C.1 Computing the digamma function C.2 Multivariate gamma 
hyperparameter optimisation C.3 Marginal KL divergence of gamma-Gaussian variables ---------- From markman at psyvax.psy.utexas.edu Mon Aug 18 16:16:13 2003 From: markman at psyvax.psy.utexas.edu (Art Markman) Date: Mon, 18 Aug 2003 15:16:13 -0500 Subject: No subject Message-ID: John R. Anderson to Receive the David E. Rumelhart Prize for Contributions to the Formal Analysis of Human Cognition The Glushko-Samuelson Foundation and the Cognitive Science Society are pleased to announce that John R. Anderson has been chosen as the fourth recipient of the $100,000 David E. Rumelhart Prize, awarded annually for outstanding contributions to the formal analysis of human cognition. Anderson will receive this prize and give the Prize Lecture at the 26th Meeting of the Cognitive Science Society in Chicago, August 4-8, 2004. The David E. Rumelhart Prize The David E. Rumelhart Prize was created by the Glushko-Samuelson Foundation to honor David E. Rumelhart, a Cognitive Scientist who exploited a wide range of formal methods to address issues and topics in Cognitive Science. Perhaps best known for his contributions to connectionist or neural network models, Rumelhart also exploited symbolic models of human cognition, formal linguistic methods, and the formal tools of mathematics. Reflecting this diversity, the first three winners of the David E. Rumelhart Prize are individuals whose work lies within three of these four approaches. Past recipients are Geoffrey Hinton, a connectionist modeler, Richard M. Shiffrin, a mathematical psychologist, and Aravind Joshi, a formal and computational linguist. Anderson is the leading proponent of the symbolic modeling framework, thereby completing coverage of the four approaches. Research Biography of John R. Anderson John R. 
Anderson, Richard King Mellon Professor of Psychology and Computer Science at Carnegie Mellon University, is an exemplary recipient for a prize that is intended to honor "a significant contemporary contribution to the formal analysis of human cognition". For the last three decades, Anderson has been engaged in a vigorous research program with the goal of developing a computational theory of mind. Anderson's work is framed within the symbol processing framework and has involved an integrated program of experimental work, mathematical analyses, computational modeling, and rigorous applications. His research has provided the field of cognitive psychology with comprehensive and integrated theories. Furthermore, it has had a real impact on educational practice in the classroom and on student achievement in learning mathematics.

Anderson's contributions have arisen across a career that consists of five distinct phases. Phase 1 began when he entered graduate school at Stanford at a time when cognitive psychology was incorporating computational techniques from artificial intelligence. During this period and immediately after his graduation from Stanford, he developed a number of simulation models of various aspects of human cognition, such as free recall [1]. His major contribution from this time was the HAM theory, which he developed with Gordon Bower. In 1973, he and Bower published the book Human Associative Memory [2], which immediately attracted the attention of everyone then working in the field. The book played a major role in establishing propositional semantic networks as the basis for representation in memory and spreading activation through the links in such networks as the basis for retrieval of information from memory.
It also provided an initial example of a research style that has become increasingly used in cognitive science: to create a comprehensive computer simulation capable of performing a range of cognitive tasks and to test this model with a series of experiments addressing the phenomena within that range.

Dissatisfied with the limited scope of his early theory, Anderson undertook the work that has been the major focus of his career to date, the development of the ACT theory [3]. ACT extended the HAM theory by combining production systems with semantic nets and the mechanism of spreading activation. The second phase of Anderson's career is associated with the initial development of ACT. The theory reached a significant level of maturity with the publication in 1983 of The Architecture of Cognition [4], which is the most cited of his research monographs (having received almost 2000 citations in the ensuing years). At the time of publication, the ACT* model described in this book was the most integrated model of cognition that had then been produced and tested. It has had a major impact on the theoretical development of the field and on the movement toward comprehensive and unified theories, incorporating separation of procedural and declarative knowledge and a series of mechanisms for production rule learning that became the focus of much subsequent research on the acquisition of cognitive skills. In his book Unified Theories of Cognition, Allen Newell had this to say: "ACT* is, in my opinion, the first unified theory of cognition. It has pride of place.... [It] provides a threshold of success which all other candidates... must exceed".

Anderson then began a major program to test whether ACT* and its skill acquisition mechanisms actually provided an integrated and accurate account of learning. He started to apply the theory to the development of intelligent tutoring systems; this defines the third phase of his research.
This work grew from an initial emphasis on teaching the programming language LISP to a broader focus on high-school mathematics [5], responding to perceptions of a national crisis in mathematics education. These systems have been shown to enable students to reach target achievement levels in a third of the usual time and to improve student performance by a letter grade in real classrooms. Anderson guided this research to the point where a full high school curriculum was developed that was used in urban schools. Subsequently, a separate corporation was created to place the tutor in hundreds of schools, influencing tens of thousands of students. The tutor curriculum was recently recognized by the Department of Education as one of five "exemplary curricula" nationwide. While Anderson does not participate in that company, he continues research developing better tools for tracking individual student cognition, and this research continues to be informed by the ACT theory. His tutoring systems have established that it is possible to impact education with rigorous simulation of human cognition.

In the late 1980s, Anderson began work on what was to define the fourth phase of his research, an attempt to understand how the basic mechanisms of a cognitive architecture were adapted to the statistical structure of the environment. Anderson (1990) [6] called this a rational analysis of cognition and applied it to the domains of human memory, categorization, causal inference, and problem solving. He utilized Bayesian statistics to derive optimal solutions to the problems posed by the environment and showed that human cognition approximated these solutions. Such optimization analysis and use of Bayesian techniques have become increasingly prevalent in Cognitive Science. Subsequent to the rational analysis effort, Anderson has returned his full attention to the ACT theory, defining the fifth and current phase of his career.
With Christian Lebiere, he has developed the ACT-R theory, which incorporates the insights from his work on rational analysis [7]. Reflecting the developments in computer technology and the techniques learned in the applications of ACT*, the ACT-R system was made available for general use. A growing and very active community of well over 100 researchers is now using it to model a wide range of issues in human cognition, including dual-tasking, memory, language, scientific discovery, and game playing. It has become increasingly used to model dynamic tasks like air-traffic control, where it promises to have training implications equivalent to the mathematics tutors. Through the independent work of many researchers, the field of cognitive science is now seeing a single unified system applied to an unrivaled range of tasks. Much of Anderson's own work on ACT-R has involved relating the theory to data from functional brain imaging [8].

In addition to his enormous volume of original work, Anderson has found the time to produce and revise two textbooks, one on cognitive psychology [9] and the other on learning and memory [10]. The cognitive psychology textbook, now in its fifth edition, helped define the course of study that is modern introductory cognitive psychology. His more recent learning and memory textbook, now in its second edition, is widely regarded as reflecting the new synthesis that is occurring in that field among animal learning, cognitive psychology, and cognitive neuroscience. Anderson has previously served as president of the Cognitive Science Society and has received a number of awards in recognition of his contributions.
In 1978 he received the American Psychological Association's Early Career Award; in 1981 he was elected to membership in the Society of Experimental Psychologists; in 1994 he received APA's Distinguished Scientific Contribution Award; and in 1999 he was elected to both the National Academy of Sciences and the American Academy of Arts and Sciences. Currently, as a member of the National Academy, he is working towards bringing more rigorous science standards to educational research.

1. Anderson, J. R., & Bower, G. H. (1972). Recognition and retrieval processes in free recall. Psychological Review, 79, 97-123.
2. Anderson, J. R., & Bower, G. H. (1973). Human associative memory. Washington: Winston and Sons.
3. Anderson, J. R. (1976). Language, memory, and thought. Hillsdale, NJ: Erlbaum.
4. Anderson, J. R. (1983). The Architecture of Cognition. Cambridge, MA: Harvard University Press.
5. Anderson, J. R., Corbett, A. T., Koedinger, K., & Pelletier, R. (1995). Cognitive tutors: Lessons learned. The Journal of the Learning Sciences, 4, 167-207.
6. Anderson, J. R. (1990). The Adaptive Character of Thought. Hillsdale, NJ: Erlbaum.
7. Anderson, J. R., & Lebiere, C. (1998). The atomic components of thought. Mahwah, NJ: Erlbaum.
8. Anderson, J. R., Qin, Y., Sohn, M-H., Stenger, V. A., & Carter, C. S. (2003). An information-processing model of the BOLD response in symbol manipulation tasks. Psychonomic Bulletin & Review, 10, 241-261.
9. Anderson, J. R. (2000). Cognitive Psychology and Its Implications: Fifth Edition. New York: Worth Publishing.
10. Anderson, J. R. (2000). Learning and Memory, Second Edition. New York: Wiley.

From calls at bbsonline.org Tue Aug 19 10:51:42 2003 From: calls at bbsonline.org (Behavioral & Brain Sciences) Date: Tue, 19 Aug 2003 15:51:42 +0100 Subject: Wegner/The Illusion of Conscious Will: BBS Multiple Book Review Message-ID: Below is a link to the forthcoming precis of a book accepted for Multiple Book Review in Behavioral and Brain Sciences (BBS).
PRECIS OF: The Illusion of Conscious Will by Daniel M. Wegner PRECIS: http://www.bbsonline.org/Preprints/Wegner-05012003/Referees/ Please note that it is the *BOOK*, not the precis, that is to be reviewed. Behavioral and Brain Sciences (BBS) is an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Reviewers must be BBS Associates or nominated by a BBS Associate. To be considered as a reviewer for this book, to suggest other appropriate reviewers, or for information about how to become a BBS Associate, please reply by EMAIL within four (4) weeks to: calls at bbsonline.org The Calls are sent to over 10,000 BBS Associates, so there is no need to reply unless you wish to review this book or to nominate someone to review it. If you are not a BBS Associate, please approach a current BBS Associate who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. To help you decide whether you would be an appropriate reviewer for this book, an electronic draft of the precis is retrievable at the URL included in this email. ======================================================================= *** IMPORTANT *** Please do not prepare a review unless you are formally invited. To help us put together a balanced list of reviewers, it would be most helpful if you would send us as specific as possible an indication of the relevant expertise you would bring to bear on the subject, and what aspect of the book you would anticipate commenting upon.
We will then let you know whether it was possible to include your name on the final formal list of invitees. NOTE: Please indicate whether you already have the book or would require a review copy to be sent to you if invited. ======================================================================= PRECIS OF: The Illusion of Conscious Will Daniel M. Wegner Department of Psychology Harvard University ABSTRACT: The experience of conscious will is the feeling that we're doing things. This feeling occurs for many things we do, conveying to us again and again the sense that we consciously cause our actions. But the feeling may not be a true reading of what is happening in our minds, brains, and bodies as our actions are produced. The feeling of conscious will can be fooled. This happens in clinical disorders such as alien hand syndrome, dissociative identity disorder, and schizophrenic auditory hallucinations. And in people without disorders, phenomena such as hypnosis, automatic writing, Ouija board spelling, water dowsing, facilitated communication, speaking in tongues, spirit possession, and trance channeling also illustrate anomalies of will: cases when actions occur without will, or will occurs without action. This book brings these cases together with research evidence from laboratories in psychology and neuroscience to explore a theory of apparent mental causation. According to this theory, when a thought appears in consciousness just prior to an action, is consistent with the action, and appears exclusive of salient alternative causes of the action, we experience conscious will and ascribe authorship to ourselves for the action. Experiences of conscious will thus arise from processes whereby the mind interprets itself, not from processes whereby mind creates action. Conscious will, in this view, is an indication that we think we have caused an action, not a revelation of the causal sequence by which the action was produced.
KEYWORDS: conscious will; free will; determinism; apparent mental causation; automatism; perceived control PRECIS: http://www.bbsonline.org/Preprints/Wegner-05012003/Referees/ ====================================================================== *** SUPPLEMENTARY ANNOUNCEMENT *** (1) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, the reason you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password. Or, email a response with the word "remove" in the subject line. 
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Thanks, Ralph BBS ------------------------------------------------------------------- Ralph DeMarco Editorial Coordinator Behavioral and Brain Sciences Journals Department Cambridge University Press 40 West 20th Street New York, NY 10011-4211 UNITED STATES bbs at bbsonline.org http://www.bbsonline.org Tel: +001 212 924 3900 ext.374 Fax: +001 212 645 5960 ------------------------------------------------------------------- From baolshausen at ucdavis.edu Thu Aug 21 16:15:06 2003 From: baolshausen at ucdavis.edu (Bruno Olshausen) Date: Thu, 21 Aug 2003 14:15:06 -0600 Subject: Network special issue Message-ID: <3F45284A.D6217341@ucdavis.edu> This month's issue of Network: Computation in Neural Systems is devoted to "Natural Scene Statistics and Neural Codes." It contains a partial collection of papers presented at the Gordon Research Conference on Sensory Coding in the Natural Environment held in summer 2002. The issue has been made available for free for a limited time at http://www.iop.org/EJ/journal/Network Bruno Olshausen and Pam Reinagel, guest editors P.S. Please note that the next Gordon Research Conference on Sensory Coding in the Natural Environment will be held Sept. 5-10, 2004 at The Queens College, Oxford, UK. See http://www.grc.org/ (program and registration info will be made available in January). -- Bruno A. Olshausen (530) 757-8749 Center for Neuroscience, UC Davis (530) 757-8827 (fax) 1544 Newton Ct. 
baolshausen at ucdavis.edu Davis, CA 95616 http://redwood.ucdavis.edu/bruno & Redwood Neuroscience Institute (650) 321-8282 x233 1010 El Camino Real, suite 380 (650) 321-8585 (fax) Menlo Park, CA 94025 http://www.rni.org

From d.mareschal at bbk.ac.uk Fri Aug 22 08:10:49 2003 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Fri, 22 Aug 2003 12:10:49 +0000 Subject: special issue Message-ID: Dear all, Some of you may be interested in this recent special issue of Developmental Science comparing connectionist and dynamic systems approaches to development. Best, Denis Mareschal

------------------

Developmental Science Volume 4 Number 4 Contents

Fast-Track Report
Larger brains and white matter volumes in children with developmental language disorder Martha R. Herbert, David A. Ziegler, Nikos Makris, Anna Bakardjiev, James Hodgson, Kristen T. Adrien, David N. Kennedy, Pauline A. Filipek, and Verne S. Caviness Jr. F11

Special Issue: Connectionist and Dynamic Systems Approaches to Development
Introduction to the special issue: Why this question and why now? John P. Spencer and Esther Thelen 375
Connectionism and dynamic systems: are they really different? Esther Thelen and Elizabeth Bates 378
Bridging the representational gap in dynamic systems approaches to development John P. Spencer and Gregor Schoner 392
Connectionist models of development Yuko Munakata and James L. McClelland 413
Development: it's about time Jeff Elman 430
Different is good: connectionism and dynamic systems theory are complementary emergentist approaches to development Linda B. Smith and Larissa K. Samuelson 434
Comparisons of developmental modeling frameworks and levels of analysis on cognition: connectionist and dynamic systems theories deserve attention, but don't yet explain attention Nelson Cowan 440

http://www.blackwellpublishing.com/journal.asp?ref=1363-755X
--
=================================================
Dr.
Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 020 7631-6582/6207/6226 fax +44 020 7631-6312 http://www.psyc.bbk.ac.uk/people/academic/mareschal_d/ =================================================

From G.Brown at cs.bham.ac.uk Tue Aug 26 10:40:35 2003 From: G.Brown at cs.bham.ac.uk (Gavin Brown) Date: Tue, 26 Aug 2003 15:40:35 +0100 (BST) Subject: Ensemble Learning Bibliography Message-ID: Dear Connectionists, A note to let you know about a new online bibliography of ensemble learning references. It focuses mostly on neural networks, but is expanding. Currently holding over 250 entries; contributions in BibTeX format are welcomed. http://www.cs.bham.ac.uk/~gxb/ensemblebib.php Thanks, -Gavin Brown

From shantanu at jhu.edu Tue Aug 5 20:57:43 2003 From: shantanu at jhu.edu (SHANTANU CHAKRABARTTY) Date: Tue, 05 Aug 2003 20:57:43 -0400 Subject: GiniSVM Toolkit v1.2 Available For Download Message-ID: The latest version 1.2 of the GiniSVM toolkit can be downloaded from http://bach.ece.jhu.edu/svm/ginisvm * The toolkit has additional heuristics to speed up training when there are a large number of classes. * A Dynamic Time Warping kernel has been added for simple transducer-based experiments. GiniSVM is a multi-class probabilistic regression machine, based on support vector machines, that generates conditional probability estimates as a solution to its training. GiniSVM probabilities can be used directly, in approximate form, in a logistic model for other higher-end models. ------------------------------------------- Shantanu Chakrabartty Center for Language and Speech Processing The Johns Hopkins University, Baltimore, MD - 21218, USA email: shantanu at jhu.edu Phone: 1-410-516-7701 Fax: 1-410-516-5566 --------------------------------------------

From terry at salk.edu Tue Aug 5 15:40:49 2003 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 5 Aug 2003 12:40:49 -0700 (PDT) Subject: NEURAL COMPUTATION 15:9 Message-ID: <200308051940.h75JenY17542@purkinje.salk.edu>

Neural Computation - Contents - Volume 15, Number 9 - September 1, 2003

VIEW
Have Brain Dynamics Evolved? Should We Look for Unique Dynamics in the Sapient Species? Theodore Holmes Bullock

LETTERS
Is There Something Out There? Inferring Space from Sensorimotor Dependencies D. Philipona, J. K. O'Regan, and J.-P.
Nadal
A Developmental Approach Aids Motor Learning Volodymyr Ivanchenko and Robert Jacobs
Cell Responsiveness in Macaque Superior Temporal Polysensory Area Measured by Temporal Discriminants J. A. Turner, K. C. Anderson, R. M. Siegel
Local Interactions in Neural Networks Explain Global Effects in Gestalt Processing and Masking Michael H. Herzog, Udo A. Ernst, Axel Etzold, and Christian W. Eurich
Computing with Populations of Monotonically Tuned Neurons Emmanuel Guigon
A Simple and Stable Numerical Solution for the Population Density Equation M. de Kamps
Slow Feature Analysis: A Theoretical Analysis of Optimal Free Responses Laurenz Wiskott
Synchrony of Fast-Spiking Interneurons Interconnected by GABAergic and Electrical Synapses Masaki Nomura, Tomoki Fukai, and Toshio Aoyagi
Activation Functions Defined on Higher Dimensional Spaces for Approximation on Compact Sets With and Without Scaling Yoshifusa Ito
Bayesian Trigonometric Support Vector Classifier Wei Chu, S. Sathiya Keerthi, and Chong Jin Ong

-----

ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES

                  USA     Canada*   Other Countries
Student/Retired   $60     $64.20    $108
Individual        $95     $101.65   $143
Institution       $590    $631.30   $638
* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu

-----

From cindy at cns.bu.edu Tue Aug 5 13:56:14 2003 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Tue, 5 Aug 2003 13:56:14 -0400 Subject: Neural Networks 16(7) Message-ID: <035701c35b7a$dd84c6f0$573dc580@bu.edu>

NEURAL NETWORKS 16(7) Contents - Volume 16, Number 7 - 2003
------------------------------------------------------------------
*** NEURAL NETWORKS LETTERS ***
Two-level hierarchy with sparsely and temporally coded patterns and its possible functional role in information processing Masaki Nomura, Toshio Aoyagi, and Masato Okada
*** NEUROSCIENCE AND NEUROPSYCHOLOGY ***
Incremental training of first order recurrent neural networks to predict a context-sensitive language Stephan K. Chalup and Alan D. Blair
A model of dopamine modulated cortical activation F. Gregory Ashby and Michael B. Casale
*** MATHEMATICAL AND COMPUTATIONAL ANALYSIS ***
Inter-module credit assignment in modular reinforcement learning Kazuyuki Samejima, Kenji Doya, and Mitsuo Kawato
Bounds on the number of hidden neurons in three-layer binary neural networks Zhaozhi Zhang, Xiaomin Ma, and Yixian Yang
A new algorithm for online structure and parameter adaptation of RBF networks Alex Alexandridis, Haralambos Sarimveis, and George Bafas
Relaxed conditions for radial-basis function networks to be universal approximators Yi Liao, Shu-Cherng Fang, and Henry L.W. Nuttle
Singularities in mixture models and upper bounds of stochastic complexity Keisuke Yamazaki and Sumio Watanabe
Study of distributed learning as a solution to category proliferation in Fuzzy ARTMAP based neural systems Emilio Parrado-Hernandez, Eduardo Gomez-Sanchez, and Yannis A.
Dimitriadis
*** TECHNOLOGY AND APPLICATIONS ***
Converting general nonlinear programming problems into separable programming problems with feedforward neural networks Bao-Liang Lu and Koji Ito
ARTMAP neural networks for information fusion and data mining: Map production and target recognition methodologies Olga Parsons and Gail A. Carpenter
*** CURRENT EVENTS ***
------------------------------------------------------------------
Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp
------------------------------
INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type         INNS            ENNS       JNNS
----------------------------------------------------------------------------
Membership with         $80 (regular)   SEK 660    Y 13,000 (plus Y 2,000
Neural Networks                                    enrollment fee)
                        $20 (student)   SEK 460    Y 11,000 (plus Y 2,000
                                                   enrollment fee)
----------------------------------------------------------------------------
Membership without      $30             SEK 200    not available to non-students
Neural Networks                                    (subscribe through another
                                                   society); Y 5,000 student
                                                   (plus Y 2,000 enrollment fee)
----------------------------------------------------------------------------

Name: _____________________________________
Title: _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone: _____________________________________
Fax: _____________________________________
Email: _____________________________________
Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408
531 28 Skovde
Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.his.se/ida/enns

JNNS Membership
c/o Professor Shozo Yasui
Kyushu Institute of Technology
Graduate School of Life Science and Engineering
2-4 Hibikino, Wakamatsu-ku
Kitakyushu 808-0196 Japan
81 93 695 6108 (phone and fax)
jnns at brain.kyutech.ac.jp
http://www.jnns.org/
-----------------------------------------------------------------

From Domenico.Perrotta at cec.eu.int Wed Aug 6 05:51:16 2003 From: Domenico.Perrotta at cec.eu.int (Domenico.Perrotta@cec.eu.int) Date: Wed, 6 Aug 2003 11:51:16 +0200 Subject: Call for Cognitive Systems proposals Message-ID: On 17 June 2003 the European Commission launched a call for proposals addressing, among other research areas, Cognitive Systems. The deadline for those wishing to apply is 15 October 2003. The indicative pre-allocated budget is 25 MEuro. Links to the official texts and the rules for applying can be found on the web site http://fp6.cordis.lu/fp6/call_details.cfm?CALL_ID=74. You can register your intention to submit a proposal at http://www.cordis.lu/fp6/pre_registration.htm

IST-2002-2.3.2.4 - Cognitive systems

Objective: To construct physically instantiated or embodied systems that can perceive, understand (the semantics of information conveyed through their perceptual input) and interact with their environment, and evolve in order to achieve human-like performance in activities requiring context- (situation- and task-) specific knowledge. Focus is on: - Methodologies and construction of robust and adaptive cognitive systems integrating perception, reasoning, representation and learning, that are capable of interpretation, physical interaction and communication in real-world environments for the purpose of performing goal-directed tasks. Research will aim at realising complete systems that achieve real-time performance and/or bounded rationality, have well-developed memory capacities (e.g.
short term, long term, iconic, associative) with efficient representation, and that acquire representations as needed to realise performance goals. The emphasis is on closing the loop in realistic test cases. A main target of this research is interdisciplinarity, i.e., to carefully consider the integration of different disciplines including computer vision, natural language understanding, robotics, artificial intelligence, mathematics and cognitive neuroscience and its impact on overall system design. Integrated Projects are expected to leverage these communities to integrate methods and insights towards the objective of realising entire systems and to promote community building. NoEs will provide a channel for fostering foundational research, for developing and maintaining common resources, specifically, of open systems and training environments to study learning and evolving systems. For specific information on the call, please write to domenico.perrotta at cec.eu.int tel: +352 4301 38257 Administrative contact person: Adriana Bini adriana.bini at cec.eu.int Tel: +352 4301 33528 From bio-adit2004-NOSPAM at listes.epfl.ch Fri Aug 8 02:03:56 2003 From: bio-adit2004-NOSPAM at listes.epfl.ch (Bio-ADIT2004) Date: Fri, 8 Aug 2003 08:03:56 +0200 Subject: [Bio-ADIT2004] - Second Call for Papers Message-ID: <18734A83-C966-11D7-9017-000A95945D40@listes.epfl.ch> ================================================================ We apologize if you receive multiple copies of this email. Please distribute this announcement to all interested parties. 
================================================================

Bio-ADIT 2004
SECOND CALL FOR PAPERS

The First International Workshop on Biologically Inspired Approaches to Advanced Information Technology
January 29 - 30, 2004
Swiss Federal Institute of Technology, Lausanne, Switzerland
Website: http://lslwww.epfl.ch/bio-adit2004/

Sponsored by
- Osaka University Forum,
- Swiss Federal Institute of Technology, Lausanne, and
- The 21st Century Center of Excellence Program of The Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan, under the Program Title "Opening Up New Information Technologies for Building Networked Symbiosis Environment"

Biologically inspired approaches have already proved successful in achieving major breakthroughs in a wide variety of problems in information technology (IT). A more recent trend is to explore the applicability of bio-inspired approaches to the development of self-organizing, evolving, adaptive and autonomous information technologies, which will meet the requirements of next-generation information systems, such as diversity, scalability, robustness, and resilience. These new technologies will become a base on which to build a networked symbiotic environment for a pleasant, symbiotic society of human beings in the 21st century. Bio-ADIT 2004 will be the first international workshop to present original research results in the field of bio-inspired approaches to advanced information technologies. It will also serve to foster the connection between biological paradigms and solutions for building next-generation information systems.

SCOPE: The primary focus of the workshop is on new and original research results in the areas of information systems inspired by biology. We invite you to submit papers that present novel, challenging, and innovative results. The topics include all aspects of bio-inspired information technologies in networks, distributed/parallel systems, hardware (including robotics) and software.
We also encourage you to submit papers dealing with:
- Self-organizing, self-repairing, self-replicating and self-stabilizing systems
- Evolving and adapting systems
- Autonomous and evolutionary software and robotic systems
- Scalable, robust and resilient systems
- Complex biosystems
- Gene, protein and metabolic networks
- Symbiosis networks

SUBMISSION OF PAPERS: Authors are invited to submit complete and original papers. Papers submitted should not have been previously published in any forum, nor be under review for any journal or other conference. All submitted papers will be refereed for quality, correctness, originality and relevance. All accepted papers will be published in the conference proceedings. It is also planned to publish accepted papers as a book. Manuscripts should include an abstract and be limited to 16 pages in single-spaced, single-column format. Submissions should include the title, author(s), authors' affiliations, e-mail address, fax number and postal address. In the case of multiple authors, an indication of which author is responsible for correspondence and for preparing the camera-ready paper for the proceedings should also be included. Electronic submission is strongly encouraged. Preferred file formats are PDF (.pdf) or Postscript (.ps). Visit our website at http://lslwww.epfl.ch/bio-adit2004/ for more information. Please contact Dr. Murata if you have to submit hard copies. Manuscripts should be submitted by September 5, 2003 through the Bio-ADIT website.
Please contact the technical program co-chairs with any questions:

Professor Auke Jan Ijspeert
School of Computer and Communication Sciences
Swiss Federal Institute of Technology (EPFL) Lausanne
CH 1015 Lausanne, Switzerland
Tel: +41-21-693-2658
Fax: +41-21-693-3705
Email: Auke.Ijspeert at epfl.ch

Professor Masayuki Murata
Cybermedia Center, Osaka University
Toyonaka, Osaka 560-0043, Japan
Tel: +81-6-6850-6860
Fax: +81-6-6850-6868
E-mail: murata at cmc.osaka-u.ac.jp

IMPORTANT DATES:
Paper submission deadline : September 5, 2003
Notification of acceptance: November 3, 2003
Camera-ready papers due   : December 1, 2003

STUDENT TRAVEL GRANTS: A limited number of travel grants will be provided for students attending Bio-ADIT 2004. Details of how to apply for a student travel grant will be posted on the workshop website.

WEBSITE: The electronic paper submission system has been accepting papers for Bio-ADIT since July 1, 2003. Please visit our website at http://lslwww.epfl.ch/bio-adit2004/ for more up-to-date information.
EXECUTIVE COMMITTEE:

General Co-Chairs:
- Daniel Mange (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Shojiro Nishio (Osaka University, Japan)
Technical Program Committee Co-Chairs:
- Auke Jan Ijspeert (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Masayuki Murata (Osaka University, Japan)
Finance Chairs:
- Marlyse Taric (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Toshimitsu Masuzawa (Osaka University, Japan)
Publicity Chairs:
- Christof Teuscher (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Takao Onoye (Osaka University, Japan)
Publications Chair:
- Naoki Wakamiya (Osaka University, Japan)
Local Arrangements Chair:
- Carlos Andres Pena-Reyes (Swiss Federal Institute of Technology, Lausanne, Switzerland)
Internet Chair:
- Jonas Buchli (Swiss Federal Institute of Technology, Lausanne, Switzerland)

TECHNICAL PROGRAM COMMITTEE:

Co-Chairs:
- Auke Jan Ijspeert (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Masayuki Murata (Osaka University, Japan)
Members:
- Michael A. Arbib (University of Southern California, Los Angeles, USA)
- Aude Billard (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Takeshi Fukuda (IBM Tokyo Research Laboratory, Japan)
- Katsuo Inoue (Osaka University, Japan)
- Wolfgang Maass (Graz University of Technology, Austria)
- Ian W. Marshall (BTexact Technologies, UK)
- Toshimitsu Masuzawa (Osaka University, Japan)
- Alberto Montresor (University of Bologna, Italy)
- Stefano Nolfi (Institute of Cognitive Sciences and Technology, CNR, Rome, Italy)
- Takao Onoye (Osaka University, Japan)
- Rolf Pfeifer (University of Zurich, Switzerland)
- Eduardo Sanchez (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Hiroshi Shimizu (Osaka University, Japan)
- Moshe Sipper (Ben-Gurion University, Israel)
- Gregory Stephanopoulos (Massachusetts Institute of Technology, USA)
- Adrian Stoica (Jet Propulsion Laboratory, USA)
- Gianluca Tempesti (Swiss Federal Institute of Technology, Lausanne, Switzerland)
- Naoki Wakamiya (Osaka University, Japan)
- Hans V. Westerhoff (Vrije Universiteit Amsterdam, NL)
- Xin Yao (University of Birmingham, UK)

From steve_kemp at unc.edu Sat Aug 16 22:08:22 2003 From: steve_kemp at unc.edu (Steven M. Kemp) Date: Sat, 16 Aug 2003 22:08:22 -0400 Subject: Announcing InSitu WebSite Message-ID: *** PLEASE FORWARD THIS TO ANY INTERESTED PERSONS OR LISTS. THANKS. *** *** (with apologies for duplicate postings) *** Dear Connectionists: The InSitu Testing Group is delighted to announce our new WebSite: http://www.InSituTestbed.org The InSitu Testing Group is an open-source community dedicated to the evaluation of computational theories of psychology and behavior against empirical data. The computational theories we hope to test include neural networks, computational learning theories, reinforcement learning theories, and cognitive theories of all kinds. Key to these evaluations is the distribution of the InSitu testbed, a software system that allows the user to test the algorithm for any given computational theory against a specific task, and to generate datasets that can be compared to datasets from experiments or field observations. The InSitu Introductory Guide describes the InSitu testbed and the opportunities to work with the group.
It is geared towards a diverse, general audience and should provide a good picture of the project to most readers, regardless of background. A PDF version of the Guide is now available and can be downloaded at: http://www.insitutestbed.org/downloads.html#GeneralDocumentation The Group hopes to have Beta versions of the testbed available to users by year's end. Please check out the Introductory Guide and the WebSite and contact us if you are interested in any of our work. We are also interested in exchanging links with any related WebSites. Thanks, steve kemp -- >>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<< Steven M. Kemp | Department of Psychology | email: steve_kemp at unc.edu Davie Hall, CB# 3270 | University of North Carolina | Chapel Hill, NC 27599-3270 | fax: (919) 962-2537 Visit our WebSite at: http://www.unc.edu/~skemp/ >>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<< The laws of mind [are] themselves of so fluid a character as to simulate divergences from law. -- C. S. Peirce (Collected Papers, 6.101). From mimura at kobe-kosen.ac.jp Sun Aug 17 23:57:20 2003 From: mimura at kobe-kosen.ac.jp (Kazushi Mimura) Date: Mon, 18 Aug 2003 12:57:20 +0900 Subject: paper available: synapse efficiency diverges due to pruning Message-ID: <200308180357.AA00220@GOLD01.kobe-kosen.ac.jp> Apologies for multiple postings. Dear colleagues, I would like to announce the following paper, available on the web site: http://www.kobe-kosen.ac.jp/~mimura/paper/pre2003.pdf ( or cond-mat/0207545 http://arxiv.org/abs/cond-mat/0207545 ) 'Synapse efficiency diverges due to synaptic pruning following over-growth.' by Mimura, K., Kimoto, T. & Okada, M. Physical Review E (in press). Abstract-------------------------------------------------------- In the development of the brain, it is known that synapses are pruned following over-growth.
This pruning following over-growth seems to be a universal phenomenon that occurs in almost all areas -- visual cortex, motor area, association area, and so on. It has been shown numerically that synapse efficiency is increased by systematic deletion. We discuss synapse efficiency to evaluate the effect of pruning following over-growth, and analytically show that the synapse efficiency diverges as O(log c) in the limit where the connecting rate c is extremely small. Under a fixed synapse number criterion, there exists an optimal connecting rate that maximizes memory performance. ---------------------------------------------------------------- Sincerely Yours, Kazushi Mimura Dept. of Electrical Engineering, Kobe City College of Technology phone: +81-78-795-3236 (direct line) fax: +81-78-795-3314 e-mail: mimura at kobe-kosen.ac.jp url: http://www.kobe-kosen.ac.jp/~mimura/ ---- Kazushi Mimura, Kobe-cct +81-78-795-3236 (direct) From beal at cs.toronto.edu Mon Aug 18 12:52:05 2003 From: beal at cs.toronto.edu (Matthew Beal) Date: Mon, 18 Aug 2003 12:52:05 -0400 Subject: PhD thesis available on Variational Bayes Message-ID: Dear Connectionists, I would like to announce my thesis, some companion Matlab software, and a website dedicated to Variational Bayesian techniques. o My thesis "Variational Methods for Approximate Bayesian Inference" is available from http://www.cs.toronto.edu/~beal/papers.html o Software for VB Mixtures of Factor Analysers, VB Hidden Markov Models, and VB State Space Models (Linear Dynamical Systems) is available from http://www.cs.toronto.edu/~beal/software.html o Variational-Bayes.org: a repository of papers, software, and links related to the use of variational methods for approximate Bayesian learning http://www.variational-bayes.org We welcome your feedback to help build this site. Below is an abstract and short contents of my thesis.
Cheers -Matt ---------- Abstract: The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents a unified variational Bayesian (VB) framework which approximates these computations in models with latent variables using a lower bound on the marginal likelihood. Chapter 1 presents background material on Bayesian inference, graphical models, and propagation algorithms. Chapter 2 forms the theoretical core of the thesis, generalising the expectation-maximisation (EM) algorithm for learning maximum likelihood parameters to the VB EM algorithm which integrates over model parameters. The algorithm is then specialised to the large family of conjugate-exponential (CE) graphical models, and several theorems are presented to pave the road for automated VB derivation procedures in both directed and undirected graphs (Bayesian and Markov networks, respectively). Chapters 3-5 derive and apply the VB EM algorithm to three commonly-used and important models: mixtures of factor analysers, linear dynamical systems, and hidden Markov models. It is shown how model selection tasks such as determining the dimensionality, cardinality, or number of variables are possible using VB approximations. Also explored are methods for combining sampling procedures with variational approximations, to estimate the tightness of VB bounds and to obtain more effective sampling algorithms. Chapter 6 applies VB learning to a long-standing problem of scoring discrete-variable directed acyclic graphs, and compares the performance to annealed importance sampling amongst other methods. Throughout, the VB approximation is compared to other methods including sampling, Cheeseman-Stutz, and asymptotic approximations such as BIC. 
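[Editor's illustration, not part of the original announcement.] The VB idea the abstract describes, iterating closed-form coordinate updates on a factorised posterior so as to maximise a lower bound on the marginal likelihood, can be sketched on a toy conjugate-exponential model. The code below is not from the thesis or its companion software; the model (a Gaussian with unknown mean and precision), the priors, and all names are illustrative, following the standard textbook treatment of VB for this model:

```python
import numpy as np

# Toy variational Bayes sketch: data x_i ~ N(mu, 1/tau), with priors
# mu | tau ~ N(mu0, 1/(lambda0*tau)) and tau ~ Gamma(a0, b0).  We fit the
# factorised posterior q(mu, tau) = q(mu) q(tau) by iterating closed-form
# updates; each update cannot decrease the lower bound on the marginal
# likelihood.

def vb_gaussian(x, mu0=0.0, lambda0=1e-3, a0=1e-3, b0=1e-3, n_iter=50):
    n, xbar = len(x), np.mean(x)
    e_tau = 1.0  # initial guess for E_q[tau]
    for _ in range(n_iter):
        # q(mu) = N(mu_n, 1/lam_n), with expectations of tau under q(tau)
        mu_n = (lambda0 * mu0 + n * xbar) / (lambda0 + n)
        lam_n = (lambda0 + n) * e_tau
        # q(tau) = Gamma(a_n, b_n), with expectations of mu under q(mu)
        a_n = a0 + 0.5 * (n + 1)
        e_sq = np.sum((x - mu_n) ** 2) + n / lam_n   # E_q[sum_i (x_i - mu)^2]
        e_sq0 = (mu_n - mu0) ** 2 + 1.0 / lam_n      # E_q[(mu - mu0)^2]
        b_n = b0 + 0.5 * (e_sq + lambda0 * e_sq0)
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

x = np.array([4.8, 5.2, 5.1, 4.9, 5.0, 5.3, 4.7, 5.1, 4.9, 5.0])
mu_n, lam_n, a_n, b_n = vb_gaussian(x)
print(mu_n, a_n / b_n)  # posterior mean of mu, posterior mean of tau
```

With the vague priors above, q(mu) concentrates near the sample mean and E_q[tau] = a_n/b_n approaches the maximum-likelihood precision; the same fitted posterior parameters are what enter the lower bound used for model comparison.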
The thesis concludes with a discussion of evolving directions for model selection including infinite models and alternative approximations to the marginal likelihood. ---------- Table of Contents:

Chapter 1  Introduction
  1.1 Probabilistic inference
  1.2 Bayesian model selection
  1.3 Practical Bayesian approaches
  1.4 Summary of the remaining chapters
Chapter 2  Variational Bayesian Theory
  2.1 Introduction
  2.2 Variational methods for ML / MAP learning
  2.3 Variational methods for Bayesian learning
  2.4 Conjugate-Exponential models
  2.5 Directed and undirected graphs
  2.6 Comparisons of VB to other criteria
  2.7 Summary
Chapter 3  Variational Bayesian Hidden Markov Models
  3.1 Introduction
  3.2 Inference and learning for maximum likelihood HMMs
  3.3 Bayesian HMMs
  3.4 Variational Bayesian formulation
  3.5 Experiments
  3.6 Discussion
Chapter 4  Variational Bayesian Mixtures of Factor Analysers
  4.1 Introduction
  4.2 Bayesian Mixture of Factor Analysers
  4.3 Model exploration: birth and death
  4.4 Handling the predictive density
  4.5 Synthetic experiments
  4.6 Digit experiments
  4.7 Combining VB approximations with Monte Carlo
  4.8 Summary
Chapter 5  Variational Bayesian Linear Dynamical Systems
  5.1 Introduction
  5.2 The Linear Dynamical System model
  5.3 The variational treatment
  5.4 Synthetic Experiments
  5.5 Elucidating gene expression mechanisms
  5.6 Possible extensions and future research
  5.7 Summary
Chapter 6  Learning the structure of discrete-variable graphical models with hidden variables
  6.1 Introduction
  6.2 Calculating marginal likelihoods of DAGs
  6.3 Estimating the marginal likelihood
  6.4 Experiments
  6.5 Open questions and directions
  6.6 Summary
Chapter 7  Conclusion
  7.1 Discussion
  7.2 Summary of contributions
Appendix A  Conjugate Exponential family examples
Appendix B  Useful results from matrix theory
  B.1 Schur complements and inverting partitioned matrices
  B.2 The matrix inversion lemma
Appendix C  Miscellaneous results
  C.1 Computing the digamma function
  C.2 Multivariate gamma hyperparameter optimisation
  C.3 Marginal KL divergence of gamma-Gaussian variables

---------- From markman at psyvax.psy.utexas.edu Mon Aug 18 16:16:13 2003 From: markman at psyvax.psy.utexas.edu (Art Markman) Date: Mon, 18 Aug 2003 15:16:13 -0500 Subject: No subject Message-ID: John R. Anderson to Receive the David E. Rumelhart Prize for Contributions to the Formal Analysis of Human Cognition The Glushko-Samuelson Foundation and the Cognitive Science Society are pleased to announce that John R. Anderson has been chosen as the fourth recipient of the $100,000 David E. Rumelhart Prize, awarded annually for outstanding contributions to the formal analysis of human cognition. Anderson will receive this prize and give the Prize Lecture at the 26th Meeting of the Cognitive Science Society in Chicago, August 4-8, 2004. The David E. Rumelhart Prize The David E. Rumelhart Prize was created by the Glushko-Samuelson Foundation to honor David E. Rumelhart, a Cognitive Scientist who exploited a wide range of formal methods to address issues and topics in Cognitive Science. Perhaps best known for his contributions to connectionist or neural network models, Rumelhart also exploited symbolic models of human cognition, formal linguistic methods, and the formal tools of mathematics. Reflecting this diversity, the first three winners of the David E. Rumelhart Prize are individuals whose work lies within three of these four approaches. Past recipients are Geoffrey Hinton, a connectionist modeler, Richard M. Shiffrin, a mathematical psychologist, and Aravind Joshi, a formal and computational linguist. Anderson is the leading proponent of the symbolic modeling framework, thereby completing coverage of the four approaches. Research Biography of John R. Anderson John R.
Anderson, Richard King Mellon Professor of Psychology and Computer Science at Carnegie Mellon University, is an exemplary recipient of a prize that is intended to honor "a significant contemporary contribution to the formal analysis of human cognition". For the last three decades, Anderson has been engaged in a vigorous research program with the goal of developing a computational theory of mind. Anderson's work is framed within the symbol processing framework and has involved an integrated program of experimental work, mathematical analyses, computational modeling, and rigorous applications. His research has provided the field of cognitive psychology with comprehensive and integrated theories. Furthermore, it has had a real impact on educational practice in the classroom and on student achievement in learning mathematics. Anderson's contributions have arisen across a career that consists of five distinct phases. Phase 1 began when he entered graduate school at Stanford at a time when cognitive psychology was incorporating computational techniques from artificial intelligence. During this period and immediately after his graduation from Stanford, he developed a number of simulation models of various aspects of human cognition such as free recall [1]. His major contribution from this time was the HAM theory, which he developed with Gordon Bower. In 1973, he and Bower published the book Human Associative Memory [2], which immediately attracted the attention of everyone then working in the field. The book played a major role in establishing propositional semantic networks as the basis for representation in memory and spreading activation through the links in such networks as the basis for retrieval of information from memory.
It also provided an initial example of a research style that has become increasingly used in cognitive science: to create a comprehensive computer simulation capable of performing a range of cognitive tasks and to test this model with a series of experiments addressing the phenomena within that range. Dissatisfied with the limited scope of his early theory, Anderson undertook the work which has been the major focus of his career to date, the development of the ACT theory [3]. ACT extended the HAM theory by combining production systems with semantic nets and the mechanism of spreading activation. The second phase of Anderson's career is associated with the initial development of ACT. The theory reached a significant level of maturity with the publication in 1983 of The Architecture of Cognition [4], which is the most cited of his research monographs (having received almost 2000 citations in the ensuing years). At the time of publication, the ACT* model described in this book was the most integrated model of cognition that had then been produced and tested. It has had a major impact on the theoretical development of the field and on the movement toward comprehensive and unified theories, incorporating separation of procedural and declarative knowledge and a series of mechanisms for production rule learning that became the focus of much subsequent research on the acquisition of cognitive skills. In his own book on Unified Theories of Cognition, Allen Newell had this to say: "ACT* is, in my opinion, the first unified theory of cognition. It has pride of place.... [It] provides a threshold of success which all other candidates... must exceed". Anderson then began a major program to test whether ACT* and its skill acquisition mechanisms actually provided an integrated and accurate account of learning. He started to apply the theory to the development of intelligent tutoring systems; this defines the third phase of his research.
This work grew from an initial emphasis on teaching the programming language LISP to a broader focus on high-school mathematics [5], responding to perceptions of a national crisis in mathematics education. These systems have been shown to enable students to reach target achievement levels in a third of the usual time and to improve student performance by a letter grade in real classrooms. Anderson guided this research to the point where a full high school curriculum was developed that was used in urban schools. Subsequently, a separate corporation has been created to place the tutor in hundreds of schools, influencing tens of thousands of students. The tutor curriculum was recently recognized by the Department of Education as one of five "exemplary curricula" nationwide. While Anderson does not participate in that company, he continues research developing better tools for tracking individual student cognition, and this research continues to be informed by the ACT theory. His tutoring systems have established that it is possible to impact education with rigorous simulation of human cognition. In the late 1980s, Anderson began work on what was to define the fourth phase of his research, which was an attempt to understand how the basic mechanisms of a cognitive architecture were adapted to the statistical structure of the environment. Anderson (1990) [6] called this a rational analysis of cognition and applied it to the domains of human memory, categorization, causal inference, and problem solving. He utilized Bayesian statistics to derive optimal solutions to the problems posed by the environment and showed that human cognition approximated these solutions. Such optimization analysis and use of Bayesian techniques have become increasingly prevalent in Cognitive Science. Subsequent to the rational analysis effort, Anderson has returned his full attention back to the ACT theory, defining the fifth and current phase of his career. 
With Christian Lebiere, he has developed the ACT-R theory, which incorporates the insights from his work on rational analysis [7]. Reflecting the developments in computer technology and the techniques learned in the applications of ACT*, the ACT-R system was made available for general use. A growing and very active community of well over 100 researchers is now using it to model a wide range of issues in human cognition, including dual-tasking, memory, language, scientific discovery, and game playing. It has become increasingly used to model dynamic tasks like air-traffic control, where it promises to have training implications equivalent to the mathematics tutors. Through the independent work of many researchers, the field of cognitive science is now seeing a single unified system applied to an unrivaled range of tasks. Much of Anderson's own work on ACT-R has involved relating the theory to data from functional brain imaging [8]. In addition to his enormous volume of original work, Anderson has found the time to produce and revise two textbooks, one on cognitive psychology [9] and the other on learning and memory [10]. The cognitive psychology textbook, now in its fifth edition, helped define the course of study that is modern introductory cognitive psychology. His more recent learning and memory textbook, now in its second edition, is widely regarded as reflecting the new synthesis that is occurring in that field among animal learning, cognitive psychology, and cognitive neuroscience. Anderson has previously served as president of the Cognitive Science Society and has received a number of awards in recognition of his contributions.
In 1978 he received the American Psychological Association's Early Career Award; in 1981 he was elected to membership in the Society of Experimental Psychologists; in 1994 he received APA's Distinguished Scientific Contribution Award; and in 1999 he was elected to both the National Academy of Sciences and the American Academy of Arts and Sciences. Currently, as a member of the National Academy, he is working towards bringing more rigorous science standards to educational research.

1. Anderson, J. R., & Bower, G. H. (1972). Recognition and retrieval processes in free recall. Psychological Review, 79, 97-123.
2. Anderson, J. R., & Bower, G. H. (1973). Human associative memory. Washington: Winston and Sons.
3. Anderson, J. R. (1976). Language, memory, and thought. Hillsdale, NJ: Erlbaum.
4. Anderson, J. R. (1983). The Architecture of Cognition. Cambridge, MA: Harvard University Press.
5. Anderson, J. R., Corbett, A. T., Koedinger, K., & Pelletier, R. (1995). Cognitive tutors: Lessons learned. The Journal of the Learning Sciences, 4, 167-207.
6. Anderson, J. R. (1990). The Adaptive Character of Thought. Hillsdale, NJ: Erlbaum.
7. Anderson, J. R., & Lebiere, C. (1998). The atomic components of thought. Mahwah, NJ: Erlbaum.
8. Anderson, J. R., Qin, Y., Sohn, M-H., Stenger, V. A., & Carter, C. S. (2003). An information-processing model of the BOLD response in symbol manipulation tasks. Psychonomic Bulletin & Review, 10, 241-261.
9. Anderson, J. R. (2000). Cognitive Psychology and Its Implications: Fifth Edition. New York: Worth Publishing.
10. Anderson, J. R. (2000). Learning and Memory, Second Edition. New York: Wiley.

From calls at bbsonline.org Tue Aug 19 10:51:42 2003 From: calls at bbsonline.org (Behavioral & Brain Sciences) Date: Tue, 19 Aug 2003 15:51:42 +0100 Subject: Wegner/The Illusion of Conscious Will: BBS Multiple Book Review Message-ID: Below is a link to the forthcoming precis of a book accepted for Multiple Book Review in Behavioral and Brain Sciences (BBS).
PRECIS OF: The Illusion of Conscious Will by Daniel M. Wegner PRECIS: http://www.bbsonline.org/Preprints/Wegner-05012003/Referees/ Please note that it is the *BOOK*, not the precis, that is to be reviewed. Behavioral and Brain Sciences (BBS) is an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Reviewers must be BBS Associates or nominated by a BBS Associate. To be considered as a reviewer for this book, to suggest other appropriate reviewers, or for information about how to become a BBS Associate, please reply by EMAIL within four (4) weeks to: calls at bbsonline.org The Calls are sent to over 10,000 BBS Associates, so there is no need to reply except if you wish to review this book, or to nominate someone to review. If you are not a BBS Associate, please approach a current BBS Associate who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. To help you decide whether you would be an appropriate reviewer for this book, an electronic draft of the precis is retrievable at the URL included in this email. ======================================================================= *** IMPORTANT *** Please do not prepare a review unless you are formally invited. To help us put together a balanced list of reviewers, it would be most helpful if you would send us as specific an indication as possible of the relevant expertise you would bring to bear on the subject, and what aspect of the book you would anticipate commenting upon.
We will then let you know whether it was possible to include your name on the final formal list of invitees. NOTE: Please indicate whether you already have the book or would require a review copy to be sent to you if invited. ======================================================================= PRECIS OF: The Illusion of Conscious Will Daniel M. Wegner Department of Psychology Harvard University ABSTRACT: The experience of conscious will is the feeling that we're doing things. This feeling occurs for many things we do, conveying to us again and again the sense that we consciously cause our actions. But the feeling may not be a true reading of what is happening in our minds, brains, and bodies as our actions are produced. The feeling of conscious will can be fooled. This happens in clinical disorders such as alien hand syndrome, dissociative identity disorder, and schizophrenic auditory hallucinations. And in people without disorders, phenomena such as hypnosis, automatic writing, Ouija board spelling, water dowsing, facilitated communication, speaking in tongues, spirit possession, and trance channeling also illustrate anomalies of will: cases when actions occur without will, or will occurs without action. This book brings these cases together with research evidence from laboratories in psychology and neuroscience to explore a theory of apparent mental causation. According to this theory, when a thought appears in consciousness just prior to an action, is consistent with the action, and appears exclusive of salient alternative causes of the action, we experience conscious will and ascribe authorship to ourselves for the action. Experiences of conscious will thus arise from processes whereby the mind interprets itself, not from processes whereby the mind creates action. Conscious will, in this view, is an indication that we think we have caused an action, not a revelation of the causal sequence by which the action was produced.
KEYWORDS: conscious will; free will; determinism; apparent mental causation; automatism; perceived control

PRECIS: http://www.bbsonline.org/Preprints/Wegner-05012003/Referees/

======================================================================
*** SUPPLEMENTARY ANNOUNCEMENT ***

(1) Call for Book Nominations for BBS Multiple Book Review

In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 multiple book treatments per year because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates, and for biobehavioral/cognitive scientists in general, to nominate books they would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!).

*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*

Please note: Your email address has been added to our user database for Calls for Commentators, which is why you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password. Or, email a response with the word "remove" in the subject line.
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*

Thanks,
Ralph
BBS

-------------------------------------------------------------------
Ralph DeMarco
Editorial Coordinator
Behavioral and Brain Sciences
Journals Department
Cambridge University Press
40 West 20th Street
New York, NY 10011-4211
UNITED STATES

bbs at bbsonline.org
http://www.bbsonline.org
Tel: +001 212 924 3900 ext. 374
Fax: +001 212 645 5960
-------------------------------------------------------------------

From baolshausen at ucdavis.edu Thu Aug 21 16:15:06 2003
From: baolshausen at ucdavis.edu (Bruno Olshausen)
Date: Thu, 21 Aug 2003 14:15:06 -0600
Subject: Network special issue
Message-ID: <3F45284A.D6217341@ucdavis.edu>

This month's issue of Network: Computation in Neural Systems is devoted to "Natural Scene Statistics and Neural Codes." It contains a partial collection of papers presented at the Gordon Research Conference on Sensory Coding in the Natural Environment, held in summer 2002. The issue has been made available for free for a limited time at http://www.iop.org/EJ/journal/Network

Bruno Olshausen and Pam Reinagel, guest editors

P.S. Please note that the next Gordon Research Conference on Sensory Coding in the Natural Environment will be held Sept. 5-10, 2004 at The Queen's College, Oxford, UK. See http://www.grc.org/ (program and registration info will be made available in January).

--
Bruno A. Olshausen                    (530) 757-8749
Center for Neuroscience, UC Davis     (530) 757-8827 (fax)
1544 Newton Ct.
baolshausen at ucdavis.edu           Davis, CA 95616
http://redwood.ucdavis.edu/bruno

& Redwood Neuroscience Institute      (650) 321-8282 x233
1010 El Camino Real, Suite 380        (650) 321-8585 (fax)
Menlo Park, CA 94025
http://www.rni.org

From d.mareschal at bbk.ac.uk Fri Aug 22 08:10:49 2003
From: d.mareschal at bbk.ac.uk (Denis Mareschal)
Date: Fri, 22 Aug 2003 12:10:49 +0000
Subject: special issue
Message-ID:

Dear all,

Some of you may be interested in this recent special issue of Developmental Science comparing connectionist and dynamic systems approaches to development.

Best,
Denis Mareschal

------------------

Developmental Science, Volume 6, Number 4

Contents

Fast-Track Report
Larger brains and white matter volumes in children with developmental language disorder
Martha R. Herbert, David A. Ziegler, Nikos Makris, Anna Bakardjiev, James Hodgson, Kristen T. Adrien, David N. Kennedy, Pauline A. Filipek, and Verne S. Caviness Jr.  F11

Special Issue: Connectionist and Dynamic Systems Approaches to Development

Introduction to the special issue: Why this question and why now?
John P. Spencer and Esther Thelen  375

Connectionism and dynamic systems: are they really different?
Esther Thelen and Elizabeth Bates  378

Bridging the representational gap in dynamic systems approaches to development
John P. Spencer and Gregor Schoner  392

Connectionist models of development
Yuko Munakata and James L. McClelland  413

Development: it's about time
Jeff Elman  430

Different is good: connectionism and dynamic systems theory are complementary emergentist approaches to development
Linda B. Smith and Larissa K. Samuelson  434

Comparisons of developmental modeling frameworks and levels of analysis on cognition: connectionist and dynamic systems theories deserve attention, but don't yet explain attention
Nelson Cowan  440

http://www.blackwellpublishing.com/journal.asp?ref=1363-755X

--
=================================================
Dr.
Denis Mareschal
Centre for Brain and Cognitive Development
School of Psychology
Birkbeck College, University of London
Malet St., London WC1E 7HX, UK
tel +44 020 7631-6582/6207/6226
fax +44 020 7631-6312
http://www.psyc.bbk.ac.uk/people/academic/mareschal_d/
=================================================

From G.Brown at cs.bham.ac.uk Tue Aug 26 10:40:35 2003
From: G.Brown at cs.bham.ac.uk (Gavin Brown)
Date: Tue, 26 Aug 2003 15:40:35 +0100 (BST)
Subject: Ensemble Learning Bibliography
Message-ID:

Dear Connectionists,

A note to let you know about a new online bibliography of ensemble learning references. It focuses mostly on neural network ensembles but is expanding. It currently holds over 250 entries; contributions in BibTeX format are welcome.

http://www.cs.bham.ac.uk/~gxb/ensemblebib.php

Thanks,
-Gavin Brown