Guo-Zheng Sun
sun at umiacs.UMD.EDU
Tue Nov 13 14:11:28 EST 1990
>I am interested in constructive algorithms for neural networks
>(capable of generating new units and layers in the NN). Is there
>any constructive algorithm for NN whose units have continuous
>(e.g.: sigmoidal) activation function?
Work of this kind dating back to 1987 may also include:
(1) G.Z. Sun, H.H. Chen and Y.C. Lee, "A Novel Net that Learns Sequential
Decision Processes", NIPS Proceedings (1987), Denver.
(2) G.Z. Sun, H.H. Chen and Y.C. Lee, "Parallel Sequential Induction Network:
A New Paradigm of Neural Network Architecture", ICNN Proceedings (1988),
San Diego.
These two papers generalized the idea of the conventional regression tree to
generate new NN units dynamically, using a continuous activation function.
The PSIN (Parallel Sequential Induction Net) algorithm combines the parallel
processing of a neural net with the sequential processing of a decision tree
to form a "most efficient" multilayered neural net classifier, where the
efficiency measure is information entropy.
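The paragraph above amounts to entropy-guided constructive growth: new
continuous-activation units are recruited one at a time, and each is judged
by how much it lowers the class entropy of the data it separates, in the
spirit of a decision-tree split. The Python sketch below only illustrates
that general idea; it is not the PSIN algorithm of the papers cited above,
and the function names (entropy, fit_split_unit, grow_network) and parameter
choices are illustrative assumptions, not taken from those papers.

import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of an integer label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_split_unit(X, y, epochs=200, lr=0.5, rng=None):
    """Train one sigmoidal unit (plain logistic regression) to separate the
    majority class from the rest; it plays the role of a soft tree split."""
    rng = np.random.default_rng() if rng is None else rng
    target = (y == np.bincount(y).argmax()).astype(float)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        grad = p - target                 # gradient of the cross-entropy loss
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def grow_network(X, y, max_units=8, min_gain=0.01):
    """Greedily add sigmoidal units; keep a unit only if the soft split it
    induces reduces class entropy (an information-gain criterion)."""
    units, subsets = [], [(X, y)]
    while subsets and len(units) < max_units:
        Xs, ys = subsets.pop(0)
        if len(np.unique(ys)) < 2:
            continue                      # region already pure, nothing to split
        w, b = fit_split_unit(Xs, ys)
        side = sigmoid(Xs @ w + b) > 0.5
        if side.all() or not side.any():
            continue                      # degenerate split, discard the unit
        gain = entropy(ys) - (side.mean() * entropy(ys[side])
                              + (~side).mean() * entropy(ys[~side]))
        if gain < min_gain:
            continue                      # not enough entropy reduction
        units.append((w, b))
        subsets += [(Xs[side], ys[side]), (Xs[~side], ys[~side])]
    return units

# Toy usage: three Gaussian blobs; units are grown until the regions they
# carve out are nearly pure in class label.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in ([0, 0], [2, 0], [1, 2])])
y = np.repeat([0, 1, 2], 50)
print("units grown:", len(grow_network(X, y)))

Each new sigmoidal unit here is fit as an ordinary logistic regression and
is kept only if it passes the information-gain test, loosely mirroring the
regression-tree analogy described above.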
-G. Z. Sun
University of Maryland