NEW Machine Learning Volume

Stephen Hanson jose at learning.siemens.com
Thu Apr 7 16:46:03 EDT 1994


This is a new volume just published that may be of interest to you:


COMPUTATIONAL LEARNING THEORY and NATURAL LEARNING SYSTEMS
Constraints and Prospects

MIT Press/Bradford Books, 1994.

Editors: S. Hanson, G. Drastal, R. Rivest

Table of Contents


FOUNDATIONS


 Daniel Osherson, Massachusetts Institute of Technology, Michael Stob,
Calvin College, and Scott Weinstein, University of Pennsylvania.  {\em Logic
and Learning}

 Ranan Banerji, Saint Joseph's University. {\em Learning Theoretical
Terms}

 Stephen Judd, Siemens Corporate Research. {\em How Network Complexity
is Affected by Node Function Sets}

 Diane Cook, University of Illinois. {\em Defining the Limits of
Analogical Planning}


REPRESENTATION and BIAS


 Larry Rendell and Raj Seshu, University of Illinois. {\em Learning Hard
Concepts Through Constructive Induction: Framework and Rationale}

 Harish Ragavan and Larry Rendell, University of Illinois. {\em The
Utility of Domain Knowledge for Learning Disjunctive Concepts}

 George Drastal, Siemens Corporate Research. {\em Learning in an
Abstraction Space}

 Raj Seshu, University of Denver. {\em Binary Decision Trees and an
``Average-Case'' Model for Concept Learning: Implications for Feature
Construction and the Study of Bias}

 Richard Maclin and Jude Shavlik, University of Wisconsin, Madison.
{\em Refining Algorithms with Knowledge-Based Neural Networks: Improving
the Chou-Fasman Algorithm for Protein Folding}


SAMPLING PROBLEMS


 Michael Kearns and Robert Schapire, Massachusetts Institute of
Technology. {\em Efficient Distribution-free Learning of Probabilistic
Concepts}

 Marek Karpinski and Thorsten Werther, University of Bonn. {\em VC
Dimension and Sampling Complexity of Learning Sparse Polynomials and
Rational Functions}

 Haym Hirsh and William Cohen, Rutgers University. {\em Learning from
Data with Bounded Inconsistency: Theoretical and Experimental Results}

 Wolfgang Maass and Gyorgy Turan, University of Illinois. {\em How Fast
Can a Threshold Gate Learn?}

 Eric Baum, NEC Research Institute. {\em When are k-Nearest Neighbor and
Back Propagation Accurate for Feasible Sized Sets of Examples?}


EXPERIMENTAL

 Ross Quinlan, University of Sydney. {\em Comparing Connectionist and
Symbolic Learning Methods}

 Andreas Weigend and David Rumelhart, Stanford University.
{\em Weight-Elimination and Effective Network Size}

 Ronald Rivest and Yiqun Yin, Massachusetts Institute of Technology.
{\em Simulation Results for a New Two-Armed Bandit Heuristic}

 Susan Epstein, Hunter College. {\em Hard Questions About Easy Tasks:
Issues From Learning to Play Games}

 Lorien Pratt, Rutgers University. {\em Experiments on the Transfer of
Knowledge between Neural Networks}





Stephen J. Hanson, Ph.D.
Head, Learning Systems Department
SIEMENS Research 
755 College Rd. East
Princeton, NJ 08540
