"Transformations" tech report
Eric Mjolsness
mjolsness-eric at YALE.ARPA
Tue Mar 7 21:23:16 EST 1989
A new technical report is available:
"Algebraic Transformations of Objective Functions"
(YALEU/DCS/RR-686)
by Eric Mjolsness and Charles Garrett
Yale Department of Computer Science
P.O. Box 2158 Yale Station
New Haven CT 06520
Abstract:
A standard neural network design trick reduces the number of connections
in the winner-take-all (WTA) network from O(N^2) to O(N). We explain the
trick as a general fixpoint-preserving transformation applied to the
particular objective function associated with the WTA network. The key
idea is to introduce new interneurons which act to maximize the objective,
so that the network seeks a saddle point rather than a minimum. A number
of fixpoint-preserving transformations are derived, allowing the
simplification of such algebraic forms as products of expressions,
functions of one or two expressions, and sparse matrix products. The
transformations may be applied to reduce or simplify the implementation of
a great many structured neural networks, as we demonstrate for inexact
graph-matching, convolutions and coordinate transformations, and sorting.
Simulations show that fixpoint-preserving transformations may be applied
repeatedly and elaborately, and the example networks still robustly
converge. We discuss implications for circuit design.
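As a rough illustration of the trick described above (a minimal sketch in my own notation, not the report's): a winner-take-all network with pairwise mutual inhibition has an objective whose (sum_i x_i)^2 term implies O(N^2) connections. Using the identity (A/2)s^2 = max_sigma [A*sigma*s - (A/2)*sigma^2], one new interneuron sigma replaces the pairwise term; sigma ascends the objective while the units descend it, so the network settles at a saddle point with only O(N) connections. The constants, inputs, and clipped Euler dynamics below are illustrative assumptions, not taken from the report.

```python
import numpy as np

# Original objective (O(N^2) connections via the (sum x)^2 term):
#   E(x) = (A/2)*[(sum_i x_i)^2 - sum_i x_i^2] - sum_i h_i * x_i
# Transformed objective with one interneuron sigma (O(N) connections):
#   E'(x, sigma) = A*sigma*sum(x) - (A/2)*sigma^2
#                  - (A/2)*sum(x**2) - h.dot(x)
# Units x do gradient descent on E'; sigma does gradient ascent,
# so the network seeks a saddle point rather than a minimum.

A = 1.0                              # inhibition strength (A > max(h))
h = np.array([0.2, 0.5, 0.3, 0.1])   # external inputs; index 1 should win
N = len(h)

x = np.zeros(N)                      # unit activities, clipped to [0, 1]
sigma = 0.0                          # interneuron state
eta = 0.05                           # Euler step size

for _ in range(2000):
    # descent on x:  dE'/dx_i = A*sigma - A*x_i - h_i
    x = np.clip(x - eta * (A * sigma - A * x - h), 0.0, 1.0)
    # ascent on sigma:  dE'/dsigma = A*sum(x) - A*sigma
    sigma = sigma + eta * (A * x.sum() - A * sigma)

print(x)       # x converges to a one-hot vector selecting argmax(h)
print(sigma)   # sigma tracks sum(x), settling near 1
```

At the saddle point the winner saturates at 1, the losers at 0, and sigma equals sum(x); substituting sigma back recovers a fixpoint of the original O(N^2) objective, which is the fixpoint-preserving property the abstract refers to.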
To request a copy, please send your physical address by e-mail to
mjolsness-eric at cs.yale.edu
OR mjolsness-eric at yale.arpa (old style)
Thank you.
-------
More information about the Connectionists mailing list