2 new papers on circuit theory, digital signal processing and recurrent neural networks

Paolo Campolucci paolo at eealab.unian.it
Fri Sep 15 13:42:22 EDT 2000


Dear Colleagues,
We would like to announce the availability of two new papers, of potential
interest to researchers working on circuit theory, digital signal
processing, and recurrent neural networks, at

          http://nnsp.eealab.unian.it/Campolucci_P

Sincerely,
Paolo


  "A Signal-Flow-Graph Approach to On-line Gradient Calculation"

      By Paolo Campolucci, Aurelio Uncini, Francesco Piazza

        Neural Computation, vol. 12, no. 8, August 2000

                            ABSTRACT

A large class of non-linear dynamic adaptive systems, such as dynamic
recurrent neural networks, can be represented very effectively by
Signal-Flow-Graphs (SFGs). With this method, complex systems are described
as a general connection of many simple components, each implementing a
simple one-input, one-output transformation, as in an electrical circuit.
Although graph representations are popular in the neural network community,
they are often used for qualitative description rather than for rigorous
representation and computational purposes.
In this paper, a method for both on-line and batch backward gradient
computation of a system output or cost function with respect to the system
parameters is derived from Signal-Flow-Graph representation theory and its
known properties. The system can be any causal dynamic system represented
by an SFG, in general non-linear and time-variant, and in particular any
feedforward, time-delay or recurrent neural network. In this work we use
discrete-time notation, but the same theory holds for the continuous-time
case. The gradient is obtained in a straightforward way by analyzing two
SFGs, the original one and its adjoint (obtained from the first by simple
transformations), without the complex chain-rule expansions of derivatives
usually employed.
This method can be used for sensitivity analysis and for learning, both
off-line and on-line. On-line learning is particularly important since it
is required by many real applications, such as digital signal processing,
system identification and control, channel equalization and predistortion.
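
For readers who would like a quick feel for the adjoint-graph idea, the
following minimal Python sketch (written for this announcement, not taken
from the paper, and restricted to batch mode) computes the gradient of a
quadratic cost with respect to the two coefficients of a first-order IIR
section by running the adjoint graph backward in time, and checks the
result against finite differences; the on-line formulation derived in the
paper is not reproduced here:

# Batch-mode sketch: gradient of E = 0.5*sum((y - d)**2) w.r.t. the
# coefficients of  y[n] = b*x[n] + a*y[n-1]  via the adjoint graph.
import numpy as np

def forward(x, a, b):
    y = np.zeros_like(x)
    y_prev = 0.0
    for n in range(len(x)):
        y[n] = b * x[n] + a * y_prev
        y_prev = y[n]
    return y

def adjoint_gradient(x, d, a, b):
    y = forward(x, a, b)
    e = y - d                        # local error dE/dy[n]
    delta = np.zeros_like(x)         # adjoint signal
    acc = 0.0
    for n in reversed(range(len(x))):
        acc = e[n] + a * acc         # adjoint graph: branch gain 'a' runs backward in time
        delta[n] = acc
    y_shift = np.concatenate(([0.0], y[:-1]))   # y[n-1]
    return np.dot(delta, y_shift), np.dot(delta, x)   # dE/da, dE/db

rng = np.random.default_rng(0)
x = rng.standard_normal(200)
d = forward(x, 0.7, 0.5)             # target produced by a "true" filter
a, b = 0.3, 1.0
g_a, g_b = adjoint_gradient(x, d, a, b)

# finite-difference check
eps = 1e-6
def cost(a, b): return 0.5 * np.sum((forward(x, a, b) - d) ** 2)
print(g_a, (cost(a + eps, b) - cost(a - eps, b)) / (2 * eps))
print(g_b, (cost(a, b + eps) - cost(a, b - eps)) / (2 * eps))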



******************************************************************



"Intrinsic Stability Control Method for Recursive Filters and Neural
Networks"

             By Paolo Campolucci and Francesco Piazza

     IEEE Transactions on Circuits and Systems, Part II: Analog and
Digital Signal Processing, vol. 47, no. 8, August 2000

                             ABSTRACT

Linear recursive filters can be adapted on-line, but instability problems
can arise. Stability control techniques exist, but they are either
computationally expensive or non-robust. In the non-linear case, e.g.
locally recurrent neural networks, the stability of the Infinite Impulse
Response (IIR) synapses is often a condition to be satisfied.
This paper considers the known reparametrization-for-stability method for
the on-line adaptation of IIR adaptive filters. A new technique is also
presented, based on the further adaptation of the squashing function, which
makes it possible to improve the convergence performance. The proposed
method can be applied to various filter realizations (direct forms, cascade
or parallel connections of second-order sections, lattice forms) as well as
to locally recurrent neural networks, such as the IIR Multi-Layer Perceptron
(IIR-MLP), with improved performance with respect to other techniques and
to the case of no stability control. The case of normalized lattice filters
is considered in particular, and the stabilization effects are analyzed
both analytically and experimentally.
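
As a rough illustration of the reparametrization-for-stability idea that the
paper builds on (the squashing-function adaptation proposed in the paper is
not reproduced here), the following Python sketch adapts a first-order IIR
filter on-line with its pole expressed as a = tanh(theta), so that |a| < 1
holds by construction; the plant, step size and signal lengths are purely
illustrative:

# On-line output-error adaptation of  y[n] = b*x[n] + a*y[n-1]
# with the pole reparametrized as a = tanh(theta) to guarantee stability.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(20000)

def run_iir(x, a, b):
    y = np.zeros_like(x)
    yp = 0.0
    for n in range(len(x)):
        y[n] = b * x[n] + a * yp
        yp = y[n]
    return y

d = run_iir(x, 0.8, 1.0)          # "unknown" stable plant to identify

theta, b = 0.0, 0.5               # adapted parameters (pole a = tanh(theta))
mu = 1e-3                         # step size (illustrative)
yp = dy_da_p = dy_db_p = 0.0      # previous output and sensitivities
for n in range(len(x)):
    a = np.tanh(theta)            # squashing keeps the pole inside (-1, 1)
    y = b * x[n] + a * yp
    e = y - d[n]
    # on-line recursive output sensitivities
    dy_da = yp + a * dy_da_p
    dy_db = x[n] + a * dy_db_p
    theta -= mu * e * dy_da * (1.0 - a ** 2)   # chain rule through tanh
    b     -= mu * e * dy_db
    yp, dy_da_p, dy_db_p = y, dy_da, dy_db

print("estimated pole:", np.tanh(theta), "estimated gain:", b)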

=================================
Paolo Campolucci, PhD
Dipartimento di Elettronica ed Automatica
Universita' di Ancona, Italy

e-mail: paolo at eealab.unian.it or campoluc at tiscalinet.it

http://nnsp.eealab.unian.it/Campolucci_P
=================================






