Encoding missing values

Luis B. Almeida lba at ilusion.inesc.pt
Mon Feb 7 04:57:07 EST 1994


Bill Skaggs writes:

  There is at least one kind of network that has no problem (in
  principle) with missing inputs, namely a Boltzmann machine.
  You just refrain from clamping the input node whose value is
  missing, and treat it like an output node or hidden unit.

  This may seem to be irrelevant to anything other than Boltzmann
  machines, but I think it could be argued that nothing very much
  simpler is capable of dealing with the problem.  When you ask
  a network to handle missing inputs, you are in effect asking it
  to do pattern completion on the input layer, and for this a
  Boltzmann machine or some other sort of attractor network would
  seem to be required.
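To make the unclamping idea concrete, here is a minimal sketch of
pattern completion in a small binary Boltzmann machine. The symmetric
weight matrix W, the bias b, and the clamp mask are illustrative
assumptions, not code from either post:

    import numpy as np

    rng = np.random.default_rng(0)

    def settle(W, b, state, clamped, steps=500, T=1.0):
        """Gibbs-sample the unclamped units; clamped units stay fixed."""
        free = [i for i in range(len(state)) if not clamped[i]]
        for _ in range(steps):
            i = rng.choice(free)                   # pick a free unit
            net = W[i] @ state + b[i]              # its net input
            p_on = 1.0 / (1.0 + np.exp(-net / T))  # stochastic sigmoid rule
            state[i] = 1.0 if rng.random() < p_on else 0.0
        return state

    # Toy case: three units; the third input is missing, so it is left
    # unclamped and treated like a hidden or output unit.
    W = np.array([[0.0, 1.5, 1.5],
                  [1.5, 0.0, 1.5],
                  [1.5, 1.5, 0.0]])
    b = np.zeros(3)
    state = np.array([1.0, 1.0, 0.0])        # 0.0 is just an initial guess
    clamped = np.array([True, True, False])  # clamp only the known inputs
    print(settle(W, b, state, clamped))      # the free unit settles near 1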


The same effect, guessing the missing inputs by completing the pattern
at the input layer, can also be obtained with a recurrent multilayer
perceptron trained with recurrent backpropagation. This is why the
pattern completion results that I described in my 1987 ICNN paper
(ref. below) were rather good.
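
Here is a minimal sketch of the settling (forward) phase of such a
recurrent network, with the known inputs clamped at every step while
the unit carrying the missing input relaxes with the rest. The
topology, the random placeholder weights, and the relaxation rule are
illustrative assumptions; in practice the weights would come from
recurrent-backprop training:

    import numpy as np

    def relax(W, b, x0, clamp_mask, clamp_vals, iters=200, rate=0.1):
        """Iterate x <- x + rate * (sigmoid(W x + b) - x) to a fixed point."""
        x = x0.copy()
        for _ in range(iters):
            target = 1.0 / (1.0 + np.exp(-(W @ x + b)))  # unit activations
            x = x + rate * (target - x)                  # relaxation step
            x[clamp_mask] = clamp_vals[clamp_mask]       # re-impose knowns
        return x

    # Toy case: units 0 and 1 carry known inputs, unit 2's input is
    # missing, unit 3 is hidden.
    rng = np.random.default_rng(1)
    W = rng.normal(0.0, 0.5, (4, 4))
    W = (W + W.T) / 2.0       # symmetric weights help the net settle
    b = np.zeros(4)
    clamp_mask = np.array([True, True, False, False])
    clamp_vals = np.array([1.0, 0.0, 0.0, 0.0])
    x = relax(W, b, np.full(4, 0.5), clamp_mask, clamp_vals)
    print(x[2])               # the network's guess for the missing input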

L. B. Almeida, "A learning rule for asynchronous perceptrons with
feedback in a combinatorial environment," Proc. IEEE First
International Conference on Neural Networks, San Diego, CA, 1987.

Luis B. Almeida

INESC                             Phone: +351-1-544607, +351-1-3100246
Apartado 10105                    Fax:   +351-1-525843
P-1017 Lisboa Codex
Portugal

lba at inesc.pt

-----------------------------------------------------------------------------

      *** Indonesians are killing innocent people in East Timor ***


