NNs and Info. Th.
Fry, Robert L.
FRYRL at f1groups.fsd.jhuapl.edu
Tue Jul 25 15:18:00 EDT 1995
New neuroprose entry:
A paper entitled "Rational neural models based on information theory"
will be presented at the Fifteenth International Workshop on MAXIMUM
ENTROPY AND BAYESIAN METHODS in Santa Fe, New Mexico, July 31 - August 4,
1995. The enclosed abstract summarizes the presentation, which describes an
information-theoretic explanation of some spatial and temporal aspects of
neurological information processing.
Author: Robert L. Fry
Affiliation: The Johns Hopkins University/Applied Physics Laboratory
Laurel, MD 20723
Title: Rational neural models based on information theory
Abstract
Biological organisms which possess a neurological system exhibit
varying degrees of what can be termed rational behavior. One can
hypothesize that rational behavior and thought processes in general arise as
a consequence of the intrinsic rational nature of the neurological system
and its constituent neurons. A similar statement may be made of the
immunological system [1]. The concept of rational behavior can be made
quantitative. In particular, one possible characterization of rational
behavior is as follows:
(1) A physical entity (observer) must exist which has the capacity for both
measurement and the generation of outputs (participation). Outputs
represent decisions on the part of the observer which will be seen to be
rational.
(2) The establishment of the quantities measurable by the observer is
achieved through learning. Learning characterizes the change in knowledge
state of an observer in response to new information and is driven by the
directed divergence information measure of Kullback [2].
(3) Output decisions must be made optimally on the basis of noisy and/or
missing input data. Optimality here means that the decision-making process
must abide by the standard logical consistency axioms, which give rise to
probability as the only logically consistent measure of degree of plausible
belief. An observer whose decision rules satisfy these axioms is said to be
rational.
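Criteria (2) and (3) can be illustrated with a minimal numerical sketch.
The two-hypothesis observer and its sensor model below are invented for
illustration: Bayes' rule supplies the consistent belief update, Kullback's
directed divergence measures the information gained by that update, and the
output decision is the hypothesis of maximal posterior belief.

```python
import math

def directed_divergence(p, q):
    """Kullback's directed divergence D(p || q) in nats: the information
    gained when belief q is revised to belief p; zero iff p equals q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bayes_update(prior, likelihoods):
    """Bayes' rule: the update consistent with the logical consistency
    axioms that single out probability as the measure of plausible belief."""
    joint = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

# Hypothetical observer with two hypotheses and one noisy binary sensor.
prior = [0.5, 0.5]
likelihoods = [0.8, 0.3]      # P(observation | hypothesis), illustrative only
post = bayes_update(prior, likelihoods)
gain = directed_divergence(post, prior)                  # learning, per (2)
decision = max(range(len(post)), key=post.__getitem__)   # decision, per (3)
```

The divergence is zero when the measurement changes nothing, so it serves as
a natural driver for learning: the observer adapts only when new information
actually arrives.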
Information theory can be used to quantify the above leading to
computational paradigms with architectures that closely resemble both the
single cortical neuron and an interconnected planar field of multiple
cortical neurons, all of which are functionally identical to one another. A
working
definition of information in a neural context must be agreed upon prior to
this development, however. Such a definition can be obtained through the
Laws of Form - a mathematics of observation originating with the British
mathematician George Spencer-Brown [3].
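The paper derives its working definition of neural information from the Laws
of Form rather than from Shannon's measure. Purely as a generic illustration
of quantifying what a neuron's output conveys about its input, the sketch
below computes the standard Shannon mutual information for a binary "neuron"
that flips its input 10% of the time; the joint distribution is invented for
the example.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a 2x2 joint distribution
    over binary input X (rows) and binary output Y (columns)."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    flat = [x for row in joint for x in row]
    return entropy(px) + entropy(py) - entropy(flat)

# Uniform binary input through a channel that flips the bit 10% of the time.
joint = [[0.45, 0.05],
         [0.05, 0.45]]
bits = mutual_information(joint)   # information the output carries about the input
```

A noiseless unit would transmit the full 1 bit per input; the 10% flip rate
reduces the transmitted information to roughly half a bit.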
[1] Francisco J. Varela, Principles of Biological Autonomy, North Holland,
1979.
[2] Solomon Kullback, Information Theory and Statistics, Wiley, 1959;
reprinted by Dover, 1968.
[3] George Spencer-Brown, Laws of Form, E. P. Dutton, New York, 1979.
The paper is available in compressed PostScript format via FTP from
archive.cis.ohio-state.edu
/pub/neuroprose/fry.maxent.ps.Z
using standard FTP procedures.