Abstract, New Squashing function...
R. Srikanth
srikanth at rex.cs.tulane.edu
Sun Feb 21 14:41:45 EST 1993
>
> ABSTRACT
>
> A BETTER ACTIVATION FUNCTION FOR ARTIFICIAL NEURAL NETWORKS
>
> TR 93-8, Institute for Systems Research, University of Maryland
>
> by David L. Elliott-- ISR, NeuroDyne, Inc., and Washington University
> January 29, 1993
> The activation function s(x) = x/(1 + |x|) is proposed for use in
> digital simulation of neural networks, on the grounds that the
> computational operation count for this function is much smaller than
> for those using exponentials and that it satisfies the simple differential
> equation s' = (1 - |s|)^2, which generalizes the logistic equation.
> The full report, a work-in-progress, is available in LaTeX or PostScript
> form (two pages + titlepage) by request to delliott at src.umd.edu.
>
>
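For anyone who wants to try it, here is a minimal sketch of the proposed
activation and its derivative (my own illustration in Python; the
function names are not from the report):

    import numpy as np

    def elliott(x):
        # Elliott squashing function s(x) = x / (1 + |x|).
        # Cheap to evaluate: one abs, one add, one divide -- no exponentials.
        return x / (1.0 + np.abs(x))

    def elliott_deriv(x):
        # Derivative s'(x) = 1 / (1 + |x|)^2 = (1 - |s(x)|)^2,
        # the generalization of the logistic equation noted above.
        return 1.0 / (1.0 + np.abs(x)) ** 2

The derivative needs no call to exp(), which is where the operation-count
savings over the logistic 1/(1 + e^-x) comes from.
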
This squashing function, while not widely used, has been adopted by a
few others. George Georgiou uses it in a complex-valued backpropagation
network: the activation function not only lets him model complex-domain
BP but also seems to lend itself to easier implementation.
For more information on complex domain backprop, contact
Dr. George Georgiou at georgiou at meridian.csci.csusb.edu
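As a rough illustration of how such a function extends to the complex
domain (my sketch only -- see Georgiou's papers for his actual
formulation; the parameters c and r below are illustrative):

    import numpy as np

    def complex_squash(z, c=1.0, r=1.0):
        # Maps the complex plane onto a disc of radius r; with
        # c = r = 1 this reduces to z / (1 + |z|), the complex
        # analogue of the Elliott function above.
        return z / (c + np.abs(z) / r)

Because |z| is real, the function preserves the phase of z while
squashing its magnitude, which is what makes it usable in a
complex-valued backprop network.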
--
srikanth at cs.tulane.edu
Dept of Computer Science,
Tulane University,
New Orleans, La - 70118