preprint

Jong-Hoon Oh jong at miata.postech.ac.kr
Fri Jul 2 15:04:14 EDT 1993


FTP-host: archive.cis.ohio-state.edu
FTP-filename:  /pub/neuroprose/oh.generalization.ps.Z

The following paper has been placed in the Neuroprose archive
(see above for ftp-host) in the file
oh.generalization.ps.Z (8 pages of output)


-----------------------------------------------------------------

Generalization in a two-layer neural network

Kukjin Kang, Jong-Hoon Oh
Department of Physics, Pohang Institute of Science and Technology,
Pohang, Kyongbuk, Korea
Chulan Kwon, Youngah Park
Department of Physics, Myong Ji University, Yongin,
Kyonggi, Korea

    Learning in a fully connected two-layer neural network with $N$ input
nodes, $M$ hidden nodes and a single output node is studied using the annealed
approximation. We study the generalization curve, i.e. the average
generalization error as a function of the number of examples. When the
number of examples is of the order of $N$, the generalization error decreases
rapidly and the system is in a permutation-symmetric (PS) phase. As the
number of examples $P$ grows to the order of $MN$, the generalization error
converges to a constant value. Finally, the system undergoes a first-order
phase transition to perfect learning, and the permutation symmetry is broken.
Computer simulations show good agreement with the analytic results.
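The architecture in the abstract can be illustrated with a small Monte Carlo sketch: a "teacher" and a "student" fully connected committee machine with $N$ inputs, $M$ hidden units with sign activations, and a single output, where the generalization error is estimated as the fraction of random inputs on which the two networks disagree. All names and the training-free setup below are illustrative assumptions, not the authors' actual simulation code.

```python
import numpy as np

# Illustrative sketch (not the paper's code): teacher/student committee
# machines with N inputs, M hidden sign units, and one sign output.
rng = np.random.default_rng(0)

def committee_output(W, x):
    """Output of a fully connected committee machine:
    sign of the sum of the hidden units' signs."""
    return np.sign(np.sign(W @ x).sum())

def generalization_error(W_teacher, W_student, n_samples=5000):
    """Monte Carlo estimate: fraction of random Gaussian inputs
    on which student and teacher outputs disagree."""
    N = W_teacher.shape[1]
    disagree = 0
    for _ in range(n_samples):
        x = rng.standard_normal(N)
        if committee_output(W_teacher, x) != committee_output(W_student, x):
            disagree += 1
    return disagree / n_samples

N, M = 50, 3  # odd M avoids ties in the output sum
W_teacher = rng.standard_normal((M, N))
# A student with random weights generalizes poorly; a student whose
# weights nearly overlap the teacher's has small error (perfect overlap
# would give zero error, i.e. perfect learning).
W_random = rng.standard_normal((M, N))
W_close = W_teacher + 0.1 * rng.standard_normal((M, N))

print("random student error:", generalization_error(W_teacher, W_random))
print("close student error: ", generalization_error(W_teacher, W_close))
```

Plotting this error against the number of training examples $P$, for a student actually trained on teacher-labeled data, would trace out the generalization curve the paper analyzes.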

PACS number(s): 87.10.+e, 05.50.+s, 64.60.Cn

Jong-Hoon Oh
jhoh at miata.postech.ac.kr


-----------------------------------------------------------------


