Contents of Neurocomputing 22 (1998)

Georg Thimm thimm at idiap.ch
Fri Nov 20 08:27:54 EST 1998


Dear reader,

Please find below a compilation of the contents of Neurocomputing and the
"Scanning the Issue" section written by V. David Sánchez A.  More information
on the journal is available at the URL http://www.elsevier.nl/locate/jnlnr/05301 .

The contents of this and other journals published by Elsevier are also
distributed by the ContentsDirect service (see the URL http://www.elsevier.nl/locate/ContentsDirect).

Please feel free to redistribute this message. My apologies if this message
is inappropriate for this mailing list; I would appreciate feedback.


With kindest regards,

     Georg Thimm


Dr. Georg Thimm
Research scientist &                         WWW: http://www.idiap.ch/~thimm
Current Events Editor of Neurocomputing      Tel.: ++41 27 721 77 39 (Fax: 12)
IDIAP / C.P. 592 / 1920 Martigny / Suisse    E-mail: thimm at idiap.ch

********************************************************************************

Journal : NEUROCOMPUTING
ISSN : 0925-2312
Vol./Iss. : 22 / 1-3

The nonlinear PCA criterion in blind source separation:
Relations with other approaches
Karhunen , Juha
pp.: 5-20

Blind separation of convolved mixtures in the frequency
domain
Smaragdis , Paris
pp.: 21-34

Blind source separation using algorithmic information theory
Pajunen , Petteri
pp.: 35-48

Independent component analysis in the presence of Gaussian
noise by maximizing joint likelihood
Hyva"rinen , Aapo
pp.: 49-67

Learned parametric mixture based ICA algorithm
Xu , Lei
pp.: 69-80

Bayesian Kullback Ying-Yang dependence reduction theory
Xu , Lei
pp.: 81-111

Robust techniques for independent component analysis (ICA)
with noisy data
Cichocki , A.
pp.: 113-129

Searching for a binary factorial code using the ICA framework
Palmieri , Francesco
pp.: 131-144

Constrained PCA techniques for the identification of common
factors in data
Charles , Darryl
pp.: 145-156

A method of blind separation for convolved non-stationary
signals
Kawamoto , Mitsuru
pp.: 157-171

Removing artifacts from electrocardiographic signals using
independent components analysis
Barros , Allan Kardec
pp.: 173-186

From neural learning to independent components
Oja , Erkki
pp.: 187-199

A nonlinear model of the binaural cocktail party effect
Girolami , Mark
pp.: 201-215


********************************************************************************

			    Neurocomputing 22 (1998)
			       Scanning the issue

In The nonlinear PCA criterion in blind source separation: Relations with
other approaches, J. Karhunen, P. Pajunen and E. Oja derive the nonlinear
Principal Component Analysis (PCA) criterion for Blind Source Separation
(BSS) in a form appropriate for comparison with other BSS and Independent
Component Analysis (ICA) approaches. The choice of the optimal nonlinearity
is explained.
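
For readers who want a concrete picture, a minimal numerical sketch of one
common form of the nonlinear PCA subspace learning rule follows; the
nonlinearity g, the step size mu, and the toy data are illustrative
assumptions, not the exact setup of the paper.

import numpy as np

def nonlinear_pca_step(W, x, mu=0.01, g=np.tanh):
    # One stochastic update of the separating matrix W (shape m x n) on a
    # prewhitened sample x: W <- W + mu * g(y) (x - W^T g(y))^T with y = W x.
    y = W @ x
    gy = g(y)
    e = x - W.T @ gy
    return W + mu * np.outer(gy, e)

# Toy usage on two artificial supergaussian sources (hypothetical data).
rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 5000))                  # sparse sources
A = rng.standard_normal((2, 2))                  # unknown mixing matrix
X = A @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
X = (E / np.sqrt(d)) @ E.T @ X                   # whitening, as the rule assumes
W = np.eye(2)
for x in X.T:
    W = nonlinear_pca_step(W, x)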

P. Smaragdis presents Blind separation of convolved mixtures in the
frequency domain using information-theoretic algorithms. The proposed
approach achieves improved efficiency and better convergence, showing clear
advantages over its time-domain counterparts. The filter parameters in the
frequency domain are orthogonal to each other; therefore, one parameter can
be updated without influencing the rest.
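
As an illustration of that per-bin independence, here is a rough sketch in
which each frequency bin of a short-time Fourier transform is unmixed on its
own; the framing parameters and the complex natural-gradient update are
generic stand-ins, not the algorithm of the paper.

import numpy as np

def stft(x, frame=256, hop=128):
    # Plain framed FFT of a 1-D signal; returns an array of shape (frames, bins).
    n = (len(x) - frame) // hop + 1
    idx = hop * np.arange(n)[:, None] + np.arange(frame)
    return np.fft.rfft(x[idx] * np.hanning(frame), axis=1)

def separate_bin(Xf, mu=0.01, iters=200):
    # Unmix a single frequency bin Xf of shape (channels, frames) with a
    # generic natural-gradient ICA step for complex data (an assumption).
    m = Xf.shape[0]
    W = np.eye(m, dtype=complex)
    for _ in range(iters):
        Y = W @ Xf
        G = np.tanh(np.abs(Y)) * np.exp(1j * np.angle(Y))
        W += mu * (np.eye(m) - (G @ Y.conj().T) / Y.shape[1]) @ W
    return W

# mixtures: a list of 1-D microphone signals (hypothetical input).
# specs = np.stack([stft(x) for x in mixtures])       # (channels, frames, bins)
# W_bins = [separate_bin(specs[:, :, k]) for k in range(specs.shape[2])]
# Each bin gets its own unmixing matrix; no coupling between bins is required.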

P. Pajunen describes Blind source separation using algorithmic information
theory. The algorithmic complexity of the sources and of the mixing mapping
is measured. Natural signals often meet the requirement of low complexity.
An experiment consisting of separating correlated signals is discussed.

A. Hyvärinen presents Independent component analysis in the presence of
Gaussian noise by maximizing joint likelihood. In the presence of noise, the
relationship between the observed data and the estimates of the independent
components is nonlinear. For supergaussian (sparse) data the nonlinearity
can be approximated by a shrinkage operation and realized using competitive
learning. For subgaussian components, anti-competitive learning can be used.
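
As a small illustration of the shrinkage idea (not the paper's exact
estimator): the maximum a posteriori estimate of a Laplacian-distributed
component observed in Gaussian noise is a soft-thresholding of the
observation; sparser densities lead to other shrinkage functions.

import numpy as np

def shrink(y, noise_std=0.3, prior_scale=0.5):
    # Soft-threshold shrinkage: MAP estimate of a Laplacian source with scale
    # prior_scale observed in Gaussian noise with standard deviation noise_std.
    t = noise_std ** 2 / prior_scale      # threshold grows with the noise level
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

y = np.array([-2.0, -0.1, 0.05, 1.5])
print(shrink(y))    # small values are pulled to zero, large ones shrink slightly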

L. Xu, C.C. Cheung and S. Amari describe a Learned parametric mixture based
ICA algorithm. It is based on a linear mixture, and its separation capability
is shown to be superior to that of the original model with a prefixed
nonlinearity. Experiments with subgaussian, supergaussian and combinations
of these types of sources confirm the applicability of the algorithm.

L. Xu presents the Bayesian Kullback Ying-Yang Dependence Reduction
(BKYY-DR) theory. In particular, the solution of the Blind Source Separation
(BSS) problem is addressed. Algorithms and criteria for parameter learning
and model selection are based on stochastic approximation. They are given
for a general BKYY-DR system and its three typical architectures.
Experiments with binary sources are reported.

A.  Cichocki, S.C.  Douglas and S. Amari propose Robust techniques for
independent component analysis (ICA) with noisy data.  A recurrent dynamic
neural network architecture is introduced for simultaneous unbiased
estimation of the unknown mixing matrix, blind source separation and noise
reduction in the extracted output signals. The shape parameters of the
nonlinearities are adjusted using gradient-based rules.

F. Palmieri, A. Budillon, M. Calabrese and D. Mattera present Searching for
a binary factorial code using the ICA framework.  Independent Component
Analysis (ICA) is formulated as density search and used for finding a
mapping of the input space into a binary string with independent bits, i.e.
a binary factorial code. Experiments with a real image show the feasibility
of the approach, whose results are compared with those of a Principal
Component Analyzer (PCA).

D. Charles describes Constrained PCA techniques for the identification of
common factors in data. An unsupervised learning network is presented that
operates similarly to Principal Factor Analysis. The network responds to
the covariance of the input data. Extensions include preprocessing and a
function that supports sparseness at the network output.

M. Kawamoto, K. Matsuoka and N. Ohnishi present A method of blind
separation for convolved non-stationary signals. The method extracts
non-stationary signals from their convolutive mixtures by minimizing
Matsuoka's cost function. Simulations confirm the feasibility of the method.
The cases in which the number of observed signals is greater than or equal
to the number of source signals are analyzed.

A.K. Barros, A. Mansour and N. Ohnishi describe Removing artifacts from
electrocardiographic signals using independent components analysis. An
architecture is proposed consisting of a high-pass filter, a two-layer
network based on an Independent Component Analysis (ICA) algorithm, and a
self-adaptive step size. Simulations using a standard ECG database are
performed.
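
A schematic of such a processing chain is sketched below; the one-pole
high-pass filter, the natural-gradient ICA step, and the crude step-size
adaptation are placeholders chosen for illustration, not the authors' exact
design.

import numpy as np

def highpass(x, alpha=0.99):
    # Simple one-pole DC-blocking filter applied to each channel to suppress
    # baseline wander: y[t] = alpha * (y[t-1] + x[t] - x[t-1]).
    y = np.zeros(x.shape, dtype=float)
    for ch in range(x.shape[0]):
        px = py = 0.0
        for t in range(x.shape[1]):
            py = alpha * (py + x[ch, t] - px)
            px = x[ch, t]
            y[ch, t] = py
    return y

def ica_separate(X, mu0=0.05, iters=300):
    # Natural-gradient ICA with a rough self-adjusting step size (an
    # assumption): the step shrinks when the update matrix is large.
    m = X.shape[0]
    W = np.eye(m)
    for _ in range(iters):
        Y = W @ X
        grad = (np.eye(m) - np.tanh(Y) @ Y.T / X.shape[1]) @ W
        mu = mu0 / (1.0 + np.linalg.norm(grad))
        W += mu * grad
    return W @ X

# ecg: array of shape (leads, samples) from a multichannel recording (hypothetical input).
# components = ica_separate(highpass(ecg))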

E. Oja presents From neural learning to independent components. Emphasis is
placed on the connection between Independent Component Analysis (ICA) and
neural learning, especially constrained Hebbian learning. The latter is a
nonlinear extension of Principal Component Analysis (PCA). Results are
given on stationary points and their asymptotic stability.

M. Girolami describes A nonlinear model of the binaural cocktail party
effect. The network is based on a temporal anti-Hebbian adaptive maximum
likelihood estimator, and its operation is similar to many approaches for
blind source separation of convolutive mixtures. Impressive results are
obtained in simulations, and favorable results are obtained under realistic,
unconstrained acoustic conditions in comparison with current adaptive
filters.

I appreciate the cooperation of all those who submitted their work for
inclusion in this issue.

V. David Sánchez A.
Editor-in-Chief

