credits

thanasis kehagias ST401843%BROWNVM.BITNET at VMA.CC.CMU.EDU
Thu Mar 16 22:19:54 EST 1989


recently i posted a note about training of HMM and Connectionist Networks,
where i was not careful enough in giving credit to people who deserved
it. let me try to make up for it:

i had a very interesting exchange of messages with Tony Robinson, which
formed the basis for my note.

i received messages with ideas and references from Mark Plumbley, Steven
Nowlan, Sue Becker and Sara Solla. Sara Solla referred me to a paper
written by Solla, Esther Levin and Michael Fleisher that deals with the
question of cross entropy. i received a copy of this paper today. it is:

 "Accelerated Learning in Layered Neural Networks", by S. Solla, E. Levin and
 M. Fleisher, Complex Systems, Vol. 2, 1988.



the paper compares cross entropy and square error and includes a
numerical study and a study of the shape of the contours of these cost
functions. therefore, a similar question i posed at the end of my note
is at least partly answered.

i also received the revised copy of G. Hinton's report on Connectionist
learning procedures, referred to in my note. in this report (Dec. 1987)
Hinton has already made a remark directly related to my point about
maximizing likelihood in the BF algorithm. specifically, he says
that (in the context of CN training with a cross entropy cost function)
likelihood is maximized when cross entropy is minimized.
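for concreteness, Hinton's remark can be checked numerically: for binary
targets and independent Bernoulli output units, the cross entropy cost is
exactly the negative log-likelihood of the targets, so minimizing one
maximizes the other. a minimal sketch (the targets and outputs below are
made up for illustration):

```python
import math

def cross_entropy(targets, outputs):
    # Cross-entropy cost for binary targets t and network outputs y in (0, 1).
    return -sum(t * math.log(y) + (1 - t) * math.log(1 - y)
                for t, y in zip(targets, outputs))

def log_likelihood(targets, outputs):
    # Log-likelihood of the targets under independent Bernoulli output units.
    return sum(math.log(y) if t == 1 else math.log(1 - y)
               for t, y in zip(targets, outputs))

targets = [1, 0, 1, 1]
outputs = [0.9, 0.2, 0.7, 0.6]

# cross entropy equals the negative log-likelihood, term by term
assert abs(cross_entropy(targets, outputs) + log_likelihood(targets, outputs)) < 1e-12
```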

i think this is all. if i have missed something, let me know about it.


                            Thanasis

