Preprint on Multiplicative Neural Networks
Michael Schmitt
mschmitt at lmi.ruhr-uni-bochum.de
Tue Apr 17 07:31:43 EDT 2001
Dear Colleagues,
A preprint of the paper
"On the complexity of computing and learning with multiplicative neural
networks"
by Michael Schmitt, to appear in Neural Computation,
is available on-line from
http://www.ruhr-uni-bochum.de/lmi/mschmitt/multiplicative.ps.gz
(63 pages gzipped PostScript).
Regards,
Michael Schmitt
------------------------------------------------------------
TITLE: On the Complexity of Computing and Learning with Multiplicative
Neural Networks
AUTHOR: Michael Schmitt
ABSTRACT
In a great variety of neuron models, neural
inputs are combined using the summing operation. We introduce the
concept of multiplicative neural networks, which contain units that
multiply their inputs instead of summing them and thus allow
inputs to interact nonlinearly. The class of multiplicative neural
networks comprises such widely known and well-studied network types
as higher-order networks and product unit networks.
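(Purely for illustration, and not part of the paper: a product unit in the sense of Durbin and Rumelhart raises each input to a learnable exponent and multiplies the results, in contrast to the weighted sum of a standard unit. The function names below are my own, and inputs are assumed positive so that real exponents are well defined.)

```python
import math

def summing_unit(x, w):
    """Standard unit: weighted sum of inputs (before any activation)."""
    return sum(wi * xi for wi, xi in zip(w, x))

def product_unit(x, w):
    """Product unit: output = prod_i x_i ** w_i, with learnable
    exponents w_i; inputs are assumed positive here."""
    return math.prod(xi ** wi for xi, wi in zip(x, w))

# With inputs [2.0, 4.0] and weights [1.0, 0.5]:
# summing_unit gives 2.0*1.0 + 4.0*0.5 = 4.0,
# product_unit gives 2.0**1.0 * 4.0**0.5 = 4.0.
```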
We investigate the complexity of computing and learning for
multiplicative neural networks. In particular, we derive upper and
lower bounds on the Vapnik-Chervonenkis (VC) dimension and the
pseudo dimension for various types of networks with multiplicative
units. As the most general case, we consider feedforward networks
consisting of product and sigmoidal units, showing that their pseudo
dimension is bounded from above by a polynomial with the same order
of magnitude as the currently best known bound for purely sigmoidal
networks. Moreover, we show that this bound holds even when
the unit type, product or sigmoidal, may be learned. Crucial
for these results are bounds on the number of connected components
of solution sets for new network classes. Regarding lower bounds, we
construct product unit networks of fixed depth with superlinear VC dimension.
For sigmoidal networks of higher order we establish polynomial
bounds that, in contrast to previous results, do not involve any
restriction of the network order. We further consider various
classes of higher-order units, also known as sigma-pi units, that
are characterized by connectivity constraints. For these classes we
derive asymptotically tight bounds.
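(Again for illustration only, with names of my own choosing: a sigma-pi unit computes a weighted sum of monomials, i.e., products over fixed subsets of the inputs. The connectivity constraints mentioned above restrict which index sets S_j may occur.)

```python
import math

def sigma_pi_unit(x, terms):
    """Sigma-pi (higher-order) unit:
    output = sum_j w_j * prod_{i in S_j} x_i,
    where each term is a pair (w_j, S_j) with S_j a tuple of
    input indices. An empty S_j yields the bias term w_j."""
    return sum(w * math.prod(x[i] for i in idx) for w, idx in terms)

# With x = [2.0, 3.0, 4.0] and terms 1.0*x_0 + 0.5*x_1*x_2:
# output = 1.0*2.0 + 0.5*3.0*4.0 = 8.0.
```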
Multiplication plays an important role both in neural modeling of
biological behavior and in computing and learning with artificial
neural networks. We briefly survey research in biology and in
applications where multiplication is considered an essential
computational element. The results we present here provide new tools
for assessing the impact of multiplication on the computational
power and the learning capabilities of neural networks.
--
Michael Schmitt
LS Mathematik & Informatik, Fakultät für Mathematik
Ruhr-Universität Bochum, D-44780 Bochum, Germany
Phone: +49 234 32-23209 , Fax: +49 234 32-14465
http://www.ruhr-uni-bochum.de/lmi/mschmitt/
More information about the Connectionists mailing list