From terry at salk.edu Mon Dec 1 20:29:45 2008
From: terry at salk.edu (Terry Sejnowski)
Date: Mon, 01 Dec 2008 17:29:45 -0800
Subject: Connectionists: NEURAL COMPUTATION - January, 2009

Neural Computation - Contents - Volume 21, Number 1 - January 1, 2009

NOTE

Long-range Out-of-Sample Properties of Auto-regressive Neural Networks
Patrick Leoni

LETTERS

Spike-Timing Error Backpropagation in Theta Neuron Networks
Sam McKennoch, Thomas Voegtlin, and Linda Bushnell

A Master Equation Formalism for Macroscopic Modeling of Asynchronous Irregular Activity States
Sami El Boustani and Alain Destexhe

A New Approach to Estimation of Attraction Domain for Hopfield-type Neural Networks
Dequan Jin and Jigen Peng

Density-Weighted Nystrom Method for Computing Large Kernel Eigensystems
Kai Zhang and James Kwok

Persistent Neural States: Stationary Localized Patterns in Nonlinear Continuous n-Population, Q-Dimensional Neural Networks
Olivier Faugeras, Romain Veltz, and Francois Grimbert

Generation of Correlated Spike Trains
Romain Brette

Getting to Know Your Neighbors: Unsupervised Learning of Topography from Real-World, Event-Based Input
Martin Boerlin, Tobias Delbruck, and Kynan Eng

Decision-theoretic Saliency: Computational Principles, Biological Plausibility, and Implications for Neurophysiology and Psychophysics
Dashan Gao and Nuno Vasconcelos

Prototype Classification: Insights from Machine Learning
Arnulf B. A. Graf, Olivier Bousquet, Gunnar Raetsch, and Bernhard Schoelkopf

-----

ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2009 - VOLUME 21 - 12 ISSUES

                  USA/Canada   Others   Electronic only
Student/Retired   $60          $123     $54
Individual        $110         $173     $99
Institution       $849         $912     $756

MIT Press Journals, 238 Main Street, Suite 500, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 577-1545
journals-orders at mit.edu
http://mitpressjournals.org/neuralcomp

-----

From terry at salk.edu Tue Dec 2 22:08:19 2008
From: terry at salk.edu (Terry Sejnowski)
Date: Tue, 02 Dec 2008 19:08:19 -0800
Subject: Connectionists: NIPS proceedings - 50% discount

Advances in Neural Information Processing Systems

50% discount on the "blue book" during the month of December for the following years:

NIPS 10 - 1997 conference
NIPS 11 - 1998 conference
NIPS 12 - 1999 conference
NIPS 13 - 2000 conference
NIPS 14 - 2001 conference
NIPS 15 - 2002 conference
NIPS 16 - 2003 conference

mitpress.mit.edu
Code: NIPS08 (enter at checkout)
Discount: 50%
Note: Restricted to North American sales only

-----

From triesch at fias.uni-frankfurt.de Fri Dec 5 08:30:41 2008
From: triesch at fias.uni-frankfurt.de (Jochen Triesch)
Date: Fri, 05 Dec 2008 14:30:41 +0100
Subject: Connectionists: Call for papers, Int. Conf. on Development and Learning (ICDL2009)
Message-ID: <49392D01.9090301@fias.uni-frankfurt.de>

8th IEEE International Conference on Development and Learning (ICDL2009)
Shanghai, June 5-7, 2009
http://www.icdl09.org/

ICDL is a multidisciplinary conference pertaining to all subjects related to the development and learning processes of natural and artificial systems, including perceptual, cognitive, behavioral, emotional, and all other mental capabilities that are exhibited by humans, higher animals, and robots. Its visionary goal is to understand autonomous development in humans and higher animals in biological, functional, and computational terms, and to enable such development in artificial systems. ICDL strives to bring together researchers in neuroscience, psychology, artificial intelligence, robotics, and other related areas to encourage understanding and cross-fertilization of the latest ideas.
Topics of interest include, but are not restricted to:

(1) Biological and biologically inspired architectures and general principles of development
(2) Neuronal, cortical and pathway plasticity
(3) Autonomous generation of internal representations
(4) Neural networks for development and learning
(5) Dynamics in neural systems and neurodynamical modeling
(6) Attention mechanisms and the development of attention skills
(7) Visual, auditory, touch systems and their development
(8) Motor systems and their development
(9) Language acquisition & understanding through development
(10) Multimodal integration through development
(11) Conceptual learning through development
(12) Motivation, value, reinforcement, and novelty
(13) Emotions and their development
(14) Learning and training techniques for assisting development
(15) Development of reasoning skills
(16) Models of developmental disorders
(17) Development of social skills
(18) Philosophical and social issues of development
(19) Robots with development and learning skills
(20) Using robots to study development and learning

ICDL2009 will feature invited plenary talks by world-renowned speakers, a variety of special sessions aligned with the conference theme, pre-conference tutorials, regular technical sessions, and poster sessions. In addition to full-paper submissions, ICDL 2009 accepts one-page abstract submissions to encourage late-breaking results or work that is not yet mature enough for a full paper.
Organizing Committee

General Chair: Juyang Weng, Michigan State University, USA
General Co-Chairs: Tiande Shou, Fudan University, China; Xiangyang Xue, Fudan University, China
Program Chairs: Jochen Triesch, Frankfurt Institute for Advanced Studies, Germany; Zhengyou Zhang, Microsoft Research, USA
Publication Chair: Yilu Zhang, GM Research, USA
Publicity Chair: Alexander Stoytchev, Iowa State University, USA
Publicity Co-Chairs: Hiroaki Wagatsuma, RIKEN, Japan (Asia); Pierre-Yves Oudeyer, INRIA Bordeaux - Sud-Ouest, France (Europe); Gedeon Deák, University of California at San Diego, USA (North America)
Local Organization Sub-Committee: Hong Lu (Chair), Rui Feng (Hotel), Cheng Jin (Web), Yuefei Guo (Finance), Wenqiang Zhang (Publication)

Important Dates

January 25, 2009: Special session and tutorial proposals
February 8, 2009: Full papers
April 19, 2009: Accept/Reject notification for full papers
April 26, 2009: One-page poster abstracts
May 3, 2009: Accept/Reject notification for poster abstracts
May 10, 2009: Final camera-ready papers

Sponsors

IEEE Computational Intelligence Society
Cognitive Science Society
Microsoft Research

From pli at richmond.edu Thu Dec 11 13:00:07 2008
From: pli at richmond.edu (Ping Li)
Date: Thu, 11 Dec 2008 13:00:07 -0500 (EST)
Subject: Connectionists: NSF call for participation
Message-ID: <4499.146.186.161.41.1229018407.squirrel@146.186.161.41>

Dear Colleagues,

NSF has issued a "Dear Colleague Letter" that invites applications for participation in a joint NSF/EPSRC "sandpit" (interactive workshop). The sandpit is meant to be an intensive, interactive, and free-thinking environment, where participants from a range of disciplines immerse themselves in collaborative thinking processes in order to construct innovative approaches to synthetic biology. Substantial funding is allocated for selected collaborative research projects arising from the sandpit.
Synthetic Biology uses biological systems as the primary source of data, dynamics, and phenomena to fabricate devices that are based on natural living systems. For example, new tools for designing and controlling neural circuits can lead to the engineering of a virtual brain, with the goal of better understanding brain/behavior interactions, and to new computer technology based on our understanding of brain processes. Cognitive science and neuroscience are essential to this goal.

Anyone eligible to apply for funding from either the NSF or EPSRC is eligible to apply to attend the sandpit. Please read the Dear Colleague Letter at http://www.nsf.gov/pubs/2009/nsf09012/nsf09012.jsp and a fuller description of the sandpit, its aims, and desired outcomes at http://www.epsrc.ac.uk/CallsForProposals/JointSyntheticBiology.htm .

If you have questions, please contact:
Rita Teutonico
Senior Advisor for Integrative Activities
rteutoni at nsf.gov
703-292-7118

----------
Ping Li, Ph.D.
Program Director, Cognitive Neuroscience
Division of Behavioral and Cognitive Sciences
National Science Foundation
4201 Wilson Boulevard
Arlington, VA 22230, USA
http://www.nsf.gov/div/index.jsp?org=BCS
Tel: (703) 292-8643
Email: p-li at nsf.gov

On leave from:
Professor of Psychology, Linguistics, and Asian Studies
Department of Psychology & Center for Language Science
Pennsylvania State University
University Park, PA 16802, USA
http://psych.la.psu.edu/directory/faculty-bios/li.html
http://lsrg.psu.edu/people/faculty/li_ping.shtml
Tel: (814) 863-3921
Email: pul8 at psu.edu

From lamb at inf.ufrgs.br Tue Dec 9 10:41:41 2008
From: lamb at inf.ufrgs.br (Luis Lamb)
Date: Tue, 09 Dec 2008 13:41:41 -0200
Subject: Connectionists: New book: Neural-Symbolic Cognitive Reasoning, from Springer's Cognitive Technologies Series
Message-ID: <493E91B5.6090701@inf.ufrgs.br>

Dear Connectionists,

I would like to announce the publication of "Neural-Symbolic Cognitive Reasoning", by Artur S. d'Avila Garcez, Luís C.
Lamb, and Dov M. Gabbay. Springer, 2009.
http://www.springer.com/978-3-540-73245-7

About the book:

Humans are often extraordinary at performing practical reasoning. There are cases where the human computer, slow as it is, is faster than any artificial intelligence system. Are we faster because of the way we perceive knowledge, as opposed to the way we represent it?

The authors address this question by presenting neural network models that integrate the two most fundamental phenomena of cognition: our ability to learn from experience, and our ability to reason from what has been learned. This book is the first to offer a self-contained presentation of neural network models for a number of computer science logics, including modal, temporal, and epistemic logics. By using a graphical presentation, it explains neural networks through a sound neural-symbolic integration methodology, and it focuses on the benefits of integrating effective robust learning with expressive reasoning capabilities.

The book will be invaluable reading for academic researchers, graduate students, and senior undergraduates in computer science, artificial intelligence, machine learning, cognitive science, and engineering. It will also be of interest to computational logicians and professional specialists in applications of cognitive, hybrid, and artificial intelligence systems.
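A core construction in this line of work (the CILP system, Chapter 4) compiles a logic program into a network whose repeated relaxation performs massively parallel deduction. The sketch below is not the authors' implementation but a minimal boolean caricature of the idea, assuming a toy definite propositional program of my own choosing: each clause becomes an AND neuron over its body atoms, each atom an OR neuron over the clauses deriving it, and iterating the network applies the program's immediate-consequence operator T_P until a fixpoint (real CILP uses weighted semi-linear neurons and supports learning).

```python
# Toy CILP-style translation: clause neurons are ANDs, atom neurons are ORs.
# Iterating one synchronous update applies the immediate-consequence
# operator T_P; the fixpoint is the least model of the definite program.

program = {            # head: list of clause bodies (illustrative example)
    "A": [["B", "C"]],
    "B": [[]],         # a fact: empty body
    "C": [["B"]],
}

atoms = sorted(program)
truth = {a: 0 for a in atoms}          # start from the empty interpretation

def step(truth):
    """One synchronous network update (one application of T_P)."""
    new = {}
    for head, bodies in program.items():
        # clause (AND) neurons: fire iff every body atom is active
        clause_fires = [all(truth[b] for b in body) for body in bodies]
        # atom (OR) neuron: fires iff some clause neuron for it fires
        new[head] = int(any(clause_fires))
    return new

# Relax to the fixpoint.
for _ in range(len(atoms) + 1):
    nxt = step(truth)
    if nxt == truth:
        break
    truth = nxt

print(truth)  # all three atoms become true: B as a fact, then C, then A
```

Here deduction takes three updates (B, then C, then A), illustrating how inference unfolds as network dynamics rather than symbolic search.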
Keywords: Artificial intelligence; Artificial neural networks; Connectionist non-classical logics; Logic for computer science; Machine learning; Neural computation; Neural-symbolic integration; Neural-symbolic learning systems

Table of Contents:
-----------------
1 Introduction
  1.1 Motivation
  1.2 Methodology and Related Work
  1.3 Structure of the Book
2 Logic and Knowledge Representation
  2.1 Preliminaries
  2.2 Classical Logic
    2.2.1 Propositional Logic
    2.2.2 First-Order Logic
  2.3 Nonclassical Logics
  2.4 Nonmonotonic Reasoning
  2.5 Logic Programming
    2.5.1 Stable-Model and Answer Set Semantics
  2.6 Discussion
3 Artificial Neural Networks
  3.1 Architectures of Neural Networks
  3.2 Learning Strategy
  3.3 Recurrent Networks
  3.4 Evaluation of Learning Models
  3.5 Discussion
4 Neural-Symbolic Learning Systems
  4.1 The CILP System
  4.2 Massively Parallel Deduction in CILP
  4.3 Inductive Learning in CILP
  4.4 Adding Classical Negation
  4.5 Adding Metalevel Priorities
  4.6 Applications of CILP
  4.7 Discussion
5 Connectionist Modal Logic
  5.1 Modal Logic and Extended Modal Programs
    5.1.1 Semantics for Extended Modal Logic Programs
  5.2 Connectionist Modal Logic
    5.2.1 Computing Modalities in Neural Networks
    5.2.2 Soundness of Modal Computation
    5.2.3 Termination of Modal Computation
  5.3 Case Study: The Muddy Children Puzzle
    5.3.1 Distributed Knowledge Representation in CML
    5.3.2 Learning in CML
  5.4 Discussion
6 Connectionist Temporal Reasoning
  6.1 Connectionist Temporal Logic of Knowledge
    6.1.1 The Language of CTLK
    6.1.2 The CTLK Algorithm
  6.2 The Muddy Children Puzzle (Full Solution)
    6.2.1 Temporal Knowledge Representation
    6.2.2 Learning in CTLK
  6.3 Discussion
7 Connectionist Intuitionistic Reasoning
  7.1 Intuitionistic Logic and Programs
  7.2 Connectionist Intuitionistic Reasoning
    7.2.1 Creating the Networks
    7.2.2 Connecting the Networks
  7.3 Connectionist Intuitionistic Modal Reasoning
  7.4 Discussion
8 Applications of Connectionist Nonclassical Reasoning
  8.1 A Simple Card Game
  8.2 The Wise Men Puzzle
    8.2.1 A Formalisation of the Wise Men Puzzle
    8.2.2 Representing the Wise Men Puzzle Using CML
  8.3 Applications of Connectionist Intuitionism
    8.3.1 Representing the Wise Men Puzzle Using CIL
  8.4 Discussion
9 Fibring Neural Networks
  9.1 The Idea of Fibring
  9.2 Fibring Neural Networks
  9.3 Examples of the Fibring of Networks
  9.4 Definition of Fibred Networks
  9.5 Dynamics of Fibred Networks
  9.6 Expressiveness of Fibred Networks
  9.7 Discussion
10 Relational Learning in Neural Networks
  10.1 An Example
  10.2 Variable Representation
  10.3 Relation Representation
  10.4 Relational Learning
  10.5 Relational Reasoning
  10.6 Experimental Results
  10.7 Discussion
11 Argumentation Frameworks as Neural Networks
  11.1 Value-Based Argumentation Frameworks
  11.2 Argumentation Neural Networks
  11.3 Argument Computation and Learning
    11.3.1 Circular Argumentation
    11.3.2 Argument Learning
    11.3.3 Cumulative (Accrual) Argumentation
  11.4 Fibring Applied to Argumentation
  11.5 Discussion
12 Reasoning about Probabilities in Neural Networks
  12.1 Representing Uncertainty
  12.2 An Algorithm for Reasoning about Uncertainty
  12.3 The Monty Hall Puzzle
  12.4 Discussion
13 Conclusions
  13.1 Neural-Symbolic Learning Systems
  13.2 Connectionist Nonclassical Reasoning
    13.2.1 Connectionist Modal Reasoning
    13.2.2 Connectionist Temporal Reasoning
  13.3 Fibring Neural Networks
  13.4 Concluding Remarks
References
Index

http://www.springer.com/978-3-540-73245-7

From h.jaeger at jacobs-university.de Mon Dec 1 10:25:26
2008
From: h.jaeger at jacobs-university.de (Herbert Jaeger)
Date: Mon, 01 Dec 2008 16:25:26 +0100
Subject: Connectionists: Postdoc position in machine learning / neural modeling
Message-ID: <493401E6.5090708@jacobs-university.de>

A fully funded Postdoctoral Research Position in Machine Learning / Mathematical Modelling of Neural Systems is open in the research group of Herbert Jaeger (http://www.faculty.jacobs-university.de/hjaeger) at Jacobs University Bremen, Germany (http://www.jacobs-university.de).

The position is created in the context of the integrated project "Self-organized Recurrent Neural Learning for Language Processing" (ORGANIC), funded by the European Commission within the 7th Framework Program "Cognitive Systems, Interaction, Robotics". Details about the project and the consortium can be found at the preliminary reservoir computing website (http://reservoir-computing.org). The project starts April 1, 2009 and runs for 3 years; the position may start earlier than the project start.

The projected research for this position concerns architecture and learning-algorithm design for large-scale recurrent neural network systems, with an emphasis on mathematical analysis. The targeted application area is speech and handwriting recognition. The project's overarching objective is to amalgamate neurobiological with engineering/mathematical perspectives, integrating a multitude of learning/adaptation/stabilization mechanisms for robustness and versatility.
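The reservoir computing paradigm referenced above can be illustrated by a minimal echo state network: a fixed random recurrent "reservoir" driven by the input, with only a linear readout trained (here by ridge regression). The sketch below is purely illustrative; the toy task, parameter values, and variable names are my assumptions, not the project's design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal echo state network: only W_out is learned; W_in and W stay fixed.
n_in, n_res, T = 1, 100, 500
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

u = np.sin(np.arange(T) * 0.2).reshape(-1, 1)  # toy input signal
y = np.roll(u, 1, axis=0)                      # target: input delayed one step

# Drive the reservoir and collect its states.
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Train the linear readout by ridge regression, skipping a washout period.
washout = 50
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ Y)

mse = float(np.mean((S @ W_out - Y) ** 2))
```

Keeping the spectral radius below 1 is the standard heuristic for the echo state property, i.e. for the reservoir state to be a fading-memory function of the input history.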
The ideal candidate would have the following qualifications:

* a PhD in machine learning, computational neuroscience, mathematics, theoretical physics, signal processing, control engineering, or a similar field,
* a strong mathematical background, especially in statistics and stochastic processes, nonlinear dynamics, and signals and systems,
* experience in machine learning, signal processing, nonlinear dynamics, and recurrent neural networks,
* highly developed communication and organization skills,
* a very good command of English,
* endless curiosity.

Besides scientific work (80%), the task profile includes assistance in project management. This aspect of the position will extend the professional skills of the researcher in important directions of large-scale project coordination and leadership. The total contract duration will be 3 years (the project runtime, starting April 1), plus optionally the time from an earlier contract start to the project start. There is no fixed application deadline.

Jacobs University Bremen is an equal opportunity employer and has been certified "Family Friendly" by the Hertie-Stiftung.

Please send, in electronic format (PDF preferred), a letter of application, a CV, copies of academic certificates, and optionally samples of published work to Herbert Jaeger (h.jaeger (at) jacobs-university.de), who also welcomes further inquiries. Applicants passing an initial screening will be asked to supply two letters of reference.

-----------------------------------------------------------------
Dr.
Herbert Jaeger
Professor for Computational Science
Jacobs University Bremen gGmbH
Campus Ring
28759 Bremen, Germany
Phone (+49) 421 200 3215
Fax (+49) 421 200 49 3215
email h.jaeger at jacobs-university.de
http://www.faculty.jacobs-university.de/hjaeger/
------------------------------------------------------------------

From juergen at idsia.ch Wed Dec 24 07:12:37 2008
From: juergen at idsia.ch (Juergen Schmidhuber)
Date: Wed, 24 Dec 2008 12:12:37 -0000
Subject: Connectionists: Driven by compression progress

Driven by Compression Progress: A Simple Principle Explains Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity, Creativity, Art, Science, Music, Jokes

Based on the keynote talk for KES 2008 and a joint invited lecture for ALT 2007 / DS 2007.

I argue that data becomes temporarily interesting by itself to some self-improving, but computationally limited, subjective observer once he learns to predict or compress the data in a better way, thus making it subjectively simpler and more `beautiful.' Curiosity is the desire to create or discover more non-random, non-arbitrary, regular data that is novel and surprising not in the traditional sense of Boltzmann and Shannon, but in the sense that it allows for compression progress because its regularity was not yet known. This drive maximizes interestingness, the first derivative of subjective beauty or compressibility, that is, the steepness of the learning curve. It motivates exploring infants, pure mathematicians, composers, artists, dancers, comedians, yourself, and recent artificial systems.

arXiv preprint: http://arXiv.org/abs/0812.4360

Overview site with previous papers on the theory of surprise, interestingness, attention, curiosity, etc.: http://www.idsia.ch/~juergen/interest.html

Juergen Schmidhuber, TUM & IDSIA
http://www6.in.tum.de/Main/Schmidhu
http://www.idsia.ch/~juergen/

PS: Soon we will announce jobs related to this topic.
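The abstract's central quantity, interestingness as the first derivative of compressibility, can be caricatured in a few lines of code. In the sketch below, zlib at increasing compression levels stands in for a learner whose compressor improves over time, and the intrinsic reward at each step is the drop in code length of the same data. This is a toy illustration of the telescoping structure of the reward, not the formal framework of the paper, where a single adaptive compressor improves on its own growing history.

```python
import zlib

# Toy "compression progress" reward: the same data, compressed by a
# sequence of ever-stronger compressors (zlib levels 1..9 as a stand-in
# for a learner whose model of the data improves over time).
data = (b"ABABABAB" * 200) + bytes(range(256))  # regular part + a noisy part

sizes = [len(zlib.compress(data, level)) for level in range(1, 10)]

# Intrinsic reward at each step = first difference of the code length,
# i.e. the compression progress achieved by the latest improvement.
rewards = [prev - cur for prev, cur in zip(sizes, sizes[1:])]

# The total reward telescopes to the overall compression progress.
assert sum(rewards) == sizes[0] - sizes[-1]
```

Note that only the regular part of the data can yield progress; the incompressible tail contributes nothing, matching the abstract's point that curiosity targets regularity that is not yet known, not arbitrary noise.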