MUME version 0.6 is available
Multi-Module Environment
mm at sedal.sedal.su.OZ.AU
Sat Apr 24 11:57:50 EDT 1993
MUME 0.6 IS NOW AVAILABLE
The Multi-Module Neural Computing Environment (MUME) version 0.6 (sources
only) is now available.
MUME-0.6 compiles on a variety of Unix machines, as well as on the Fujitsu
VP2200 and on PCs (MSDOS 5.0 or higher, using DJGCC).
HOW TO GET IT
-------------
It can be acquired by fetching the licence file:
file: license.ps (PostScript file)
machine: 129.78.13.39
directory: /pub
login: anonymous
password: your email address
having it signed by an authorised person, and then sending or faxing it to
MUME 0.6
SEDAL
Sydney University Electrical Engineering
NSW 2006 Australia
Fax: (+61-2) 660-1228
The machine, account, and password from which you can ftp the sources will
then be communicated to you.
*** PLEASE DO NOT FORGET TO WRITE YOUR EMAIL CONTACT ON THE FAXED LICENSE ***
PC USERS
--------
If you don't have the DJGCC compiler, you can write to the address above
with a signed license and a cheque for A$150 (for media/doc/postage) and
we will send you the software and binaries. Do not forget to clearly
specify the media (3.5" or 5.25") and your surface mail address. Note that
MUME compiled under DJGCC will not run under Microsoft Windows.
MAILING LIST
------------
Once you have the software, you can ask to be included on a mailing list by
sending your email address to
mume-request at sedal.su.oz.au
MORE INFO ABOUT MUME OR CHANGES
-------------------------------
If you don't know what MUME is, you can fetch the file /pub/mume-overview.ps.Z
from 129.78.13.39 (login as anonymous).
Otherwise here is a copy of the CHANGES file (from version 0.5 to 0.6):
o A detailed basic tutorial has been written (directory tutes/tut0)
o To simplify interconnection statements between nets, MUME
now generates default "iface"s. For example, for an MLP called
john, MUME automatically generates the interfaces john.in (input
layer) and john.out (output layer). This applies to most nets.
Interconnection semantics have been simplified even further by
introducing a "base" index which simplifies neuron references.
All information about interfaces is now described in a separate
manual page called IFACE.5.
o The configuration files can make use of symbols which can be set
in the file or on the command line of the front-end program (see
man pages SYMBOLS.5 and MMN.5). Some nets now also define their
own symbols (e.g. the "mlp" net).
o The specification of neuron indices for the "nfun" keyword has been
enhanced to allow easier indexing (see NET.5).
o All front ends now default to a "test" mode. To train, the
"-train" switch is required.
o Data reading routines of the ENV net have been optimised.
o Data normalisation statements in ENV have been modified (see man
pages ENV.5 and NORM.5).
o MUME now supports the use of a validation set during training. The
main purpose of a validation set is to prevent overtraining, as the
error on both the training and validation sets can be tracked as
training progresses.
To use the validation set, set the optional "Validate" flag in the
system definition section to 1 (using the statement "Validate 1;")
and specify a validation data set in all ENV modules (using
the statement "data Validate <FileName>;"). The error on the
validation set will now be logged as a 3rd column along with the
epoch number and training error.
See the ENV.5 and MMN.5 man pages for more information.
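Putting the two statements quoted above together, a minimal sketch of the
relevant configuration lines might look like this (the file name valid.dat
is purely illustrative; see ENV.5 and MMN.5 for the actual syntax):

```
Validate 1;

data Validate valid.dat;
```

The first line goes in the system definition section; the second goes in
every ENV module, with valid.dat standing in for your own validation data
file.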
o MUME now catches more system signals when possible and exits
after saving the net states upon receiving them.
The signals are: SIGINT, SIGTERM, SIGXCPU, and SIGXFSZ.
o Logging output is more consistent under the control of the
"-verbose" switch.
o The following learning algorithms have been added:
stochastic error descent for limited precision training
reinforcement learning
conjugate gradient
simplex based methods
o The following net classes have been added:
resource allocation nets (class RAN)
NeTtalk postprocessing module (class N2K)
o The class RBPTT has been renamed WPANG.
o The WZ class (continuously running recurrent net) now has
what are called pins, which have zero propagation delay.
o and of course, many bugs were fixed.
The behaviour of the learning algorithms has not changed between 0.5 and
0.6. All configuration files should still run under 0.6, except for the
normalisation statements in the ENV class. We are confident that the new
statements make declarations much easier.
mume-request at sedal.su.OZ.AU