paper available: Approximation by neural networks is not continuous

Paul Kainen kainen at cs.umd.edu
Mon Mar 2 23:02:50 EST 1998


 
  Dear Colleagues,
 
  The paper described below is accessible via the web at
  
  http://www.clark.net/pub/kainen/not-cont.ps
 
  It is 10 printed pages (174 KB); sorry, hard copy is not available.
  The paper has been submitted to a special issue of a journal.
 
  
  
  Approximation by neural networks is not continuous
 
  Paul C. Kainen, Vera Kurkova and Andrew Vogt
 
 
  It is shown that in a Banach space X satisfying mild conditions,
  for an infinite, linearly independent subset G there is no
  continuous best approximation map from X to the n-span, span_n G.
  The hypotheses are satisfied when X is an L_p space, 1 < p < \infty,
  and G is the set of functions computed by the hidden units of a
  typical neural network (e.g., Gaussian, Heaviside, or hyperbolic
  tangent units).  It is also shown that if G is finite and span_n G
  is not a linear subspace of X, then there is no continuous map from
  X to span_n G that comes within any positive constant of a best
  approximation.
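 
  For readers who prefer symbols, a sketch in standard notation (the
  precise hypotheses are in the paper):
 
    \operatorname{span}_n G := \Bigl\{ \textstyle\sum_{i=1}^{n} c_i g_i
      \;:\; c_1, \dots, c_n \in \mathbb{R},\ g_1, \dots, g_n \in G \Bigr\},
 
  and a best approximation map is any \Phi : X \to \operatorname{span}_n G
  with
 
    \| x - \Phi(x) \| = \inf_{y \in \operatorname{span}_n G} \| x - y \|
      \quad \text{for every } x \in X.
 
  The first result says no such \Phi can be continuous; the second says
  that, for finite G, even a continuous map achieving the infimum only
  to within a fixed \varepsilon > 0 is impossible.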
 
  Keywords.  nonlinear approximation, one-hidden-layer neural network,
  rates of approximation, continuous selection, metric projection,
  proximinal set, Chebyshev set, n-width, geometry of Banach spaces.
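 
  To make "functions computed by hidden units" concrete, here is a
  small illustrative Python sketch (the function names are ours, not
  from the paper) of an element of span_n G for hyperbolic tangent
  units:
 
    import numpy as np
 
    def hidden_unit(w, b):
        # One hidden-unit function x |-> tanh(w*x + b); the set of all
        # such functions, over weights w and biases b, plays the role of G.
        return lambda x: np.tanh(w * x + b)
 
    def network_output(coeffs, units):
        # A linear combination sum_i c_i g_i of n hidden units -- a point
        # of span_n G, i.e. the output of a one-hidden-layer network.
        return lambda x: sum(c * g(x) for c, g in zip(coeffs, units))
 
    # Example: a network with n = 3 tanh hidden units on the real line.
    units = [hidden_unit(1.0, 0.0), hidden_unit(2.0, -1.0), hidden_unit(0.5, 3.0)]
    f = network_output([0.7, -1.2, 0.4], units)
    print(f(np.linspace(-5.0, 5.0, 11)))
 
  The theorems concern maps that pick such an f for each target
  function in X; the paper shows that no such choice can be both
  continuous and distance-minimizing.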
 
 
  kainen at gumath1.math.georgetown.edu
  vera at uivt.cas.cz
  andy at gumath1.math.georgetown.edu  

