TDNN network configuration file(s) for PlaNet

Ben Bryant bdbryan at eng.clemson.edu
Wed Jun 3 20:21:33 EDT 1992


I recently sent a message on this topic that was somehow garbled
in transmission; I apologize for that. The file sent in the last
mailing was an ASCII text file containing our current "best
estimate" of how the training of a TDNN takes place, implemented
as a PlaNet network configuration file. If anyone out there has
experience with PlaNet and has written a correct TDNN network
config file for this package, I wonder if you might be kind
enough to send us a copy. If you cannot do this for
non-disclosure reasons, could you please simply look over the
following implementation and tell me whether we have implemented
the training procedure correctly? I would be much obliged. The
following is our "best guess" TDNN:
####  file for 3-layer TDNN network with input 40x15, N=2; hidden 20x13,
####  N=4; output 3x9

# DEFINITIONS OF DELAY

define NDin	3
define NDhid	5

define NDin_1	2
define NDhid_1	4

# DEFINITIONS OF UNITS
 
define NUin	40
define NUhid	20
define NUout	3

define NUin_1	39
define NUhid_1	19
define NUout_1	2

# DEFINITIONS OF FRAMES

define NFin	15
define NFhid	(NFin-NDin+1)		# 13 hidden frames
define NFout	(NFin-NDin+2-NDhid)	# = NFhid-NDhid+1 = 9 output frames
 
define BiasHid	0
define BiasOut	0

## DEFINITIONS OF LAYERS

layer Input NFin*NUin
layer Hidden NUhid*NFhid
layer Output NFout*NUout
layer Result NUout
define biasd user1	# map biasd onto the user1 field (holds the bias momentum)

## DEFINITIONS OF INPUT/TARGET BUFFERS

target NFout*NUout
input NFin*NUin

## DEFINITIONS OF CONNECTIONS

define Win (NUin*NDin_1+NUin_1)
define Whids 0
define Whid (NUhid_1)
connect InputHidden1 Input[0-Win] to Hidden[0-Whid]

define WHid (NUhid*NDhid_1+NUhid_1)
define Wout (NUout_1)
connect HiddenOutput1 Hidden[0-WHid] to Output[0-Wout]

## n.3layer.expr: implementation of a 3layer-feedforward-net with expressions.
## define Nin, Nhid, Nout, BiasHid and BiasOut as desired.

define ErrMsg \n\tread\swith\s'network\sNin=<no-of-input>\sNhid=<no-of-hidden>\sNout=<no-of-output>\sBiasHid=<bias-of-hidden>\sBiasOut=<bias-of-output>\sn.3layer.expr'\n

#IFNDEF NDin;  printf ErrMsg; exit; ENDIF
#IFNDEF Nhid; printf ErrMsg; exit; ENDIF
#IFNDEF Nout; printf ErrMsg; exit; ENDIF
IFNDEF BiasHid; printf ErrMsg; exit; ENDIF
IFNDEF BiasOut; printf ErrMsg; exit; ENDIF

# macro definitions of the derivatives of the sigmoid for Hidden and Output
IF $min==0&&$max==1
define HiddenDer Hidden*(1-Hidden)
define OutputDer Output*(1-Output)
ELSE
define HiddenDer (Hidden-$min)*($max-Hidden)/($max-$min)
define OutputDer (Output-$min)*($max-Output)/($max-$min)
ENDIF

## PROCEDURE FOR ACTIVATING NETWORK FORWARD

procedure activate
    scalar i
    i=0

    Input=$input

    while i<NFhid
    	Hidden:net[i*NUhid->i*NUhid+NUhid_1]=InputHidden1 \
		**T(Input[i*NUin->i*NUin+NUin*NDin_1+NUin_1])
     	i+=1
    endwhile

    Hidden = logistic(Hidden:net+(BiasHid*Hidden:bias))

    i=0
    while i<NFout 
  	Output:net[i*NUout->i*NUout+NUout_1] = HiddenOutput1 \
	       **T(Hidden[i*NUhid->i*NUhid+NUhid*NDhid_1+NUhid_1])
	i+=1
    endwhile

    Output=logistic(Output:net+(BiasOut*Output:bias))
    $Error=mean((Output:delta=$target-Output)^2)/2

    Output:delta*=OutputDer
end

## PROCEDURE FOR TRAINING NETWORK

matrix Hidden_delta NFout NDhid*NUhid

procedure learn
    call activate

    scalar i; scalar j
    i=0

    while i<NFout
        Hidden_delta[i]=Output:delta[i*NUout->i*NUout+NUout_1] \
               **HiddenOutput1*HiddenDer[i*NUhid->i*NUhid+NUhid*NDhid_1+NUhid_1]
	i+=1
    endwhile

    Hidden:delta=0
    i=0

    # fold the per-output-frame hidden deltas back onto the layer; each
    # hidden frame is touched by up to NDhid of the output windows
    while i<NFout
	j=0
	while j<NDhid
            Hidden:delta[(i+j)*NUhid->(i+j)*NUhid+NUhid_1] \
	    += Hidden_delta[i][j*NUhid->j*NUhid+NUhid_1]
	    j+=1
	endwhile
	i+=1
    endwhile

    i=0

    # average the accumulated deltas: interior hidden frames received NDhid
    # contributions above, frames near either edge received fewer
    while i<NFhid
	if (i<NDhid ) then
	   Hidden:delta[i*NUhid->i*NUhid+NUhid_1]/=(i+1)
	endif
	if (NFhid-i<NDhid) then
	   Hidden:delta[i*NUhid->i*NUhid+NUhid_1]/=(NFhid-i)
	endif
	if ((NFhid-i>=NDhid) && (i>=NDhid)) then
	   Hidden:delta[i*NUhid->i*NUhid+NUhid_1]/=(NDhid)
	endif
	i+=1
    endwhile

    i=0
    # pre-scale the old weight-delta by alpha times the number of gradient
    # terms summed below, so the momentum survives the final averaging
    InputHidden1:delta*=$alpha*(NDhid*(NFout))
    while i<NFout
	j=0
	while j<NDhid
            InputHidden1:delta \
	    += $eta*T(Hidden_delta[i][j*NUhid->j*NUhid+NUhid_1]) \
	       **Input[(i+j)*NUin->(i+j)*NUin+NDin_1*NUin+NUin_1]
	    j+=1 
	endwhile
	i+=1
    endwhile
	 
    InputHidden1 += InputHidden1:delta/=(NDhid*(NFout))

    i=0
    # same momentum pre-scaling for the hidden-to-output weights
    HiddenOutput1:delta*=$alpha*(NFout)
    while i<NFout
        HiddenOutput1:delta+=$eta*T(Output:delta[i*NUout->i*NUout+NUout_1]) \
              **Hidden[i*NUhid->i*NUhid+NUhid*NDhid_1+NUhid_1] 
	i+=1
    endwhile

    HiddenOutput1:delta/=(NFout)
    HiddenOutput1+=HiddenOutput1:delta
    Hidden:bias+=Hidden:biasd=Hidden:delta*$eta+Hidden:biasd*$alpha
    Output:bias+=Output:biasd=Output:delta*$eta+Output:biasd*$alpha
end 
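In case it helps anyone checking our logic, here is a plain-Python paraphrase (our own sketch, nothing PlaNet provides) of the weight sharing the file above is meant to express: a single input-to-hidden matrix slid over every window of NDin input frames, and a single hidden-to-output matrix slid over every window of NDhid hidden frames, as in the while-loops of `procedure activate`:

```python
# Plain-Python paraphrase of the forward pass in `procedure activate`.
# Sizes match the header comment: input 40x15 -> hidden 20x13 -> output 3x9.
import math
import random

random.seed(0)

NUin, NUhid, NUout = 40, 20, 3     # units per frame
NDin, NDhid = 3, 5                 # delay (window) widths
NFin = 15                          # input frames
NFhid = NFin - NDin + 1            # 13 hidden frames
NFout = NFin - NDin + 2 - NDhid    # = NFhid - NDhid + 1 = 9 output frames

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# One shared weight matrix per layer (InputHidden1 / HiddenOutput1 in the file).
W_ih = [[random.uniform(-0.1, 0.1) for _ in range(NUin * NDin)]
        for _ in range(NUhid)]
W_ho = [[random.uniform(-0.1, 0.1) for _ in range(NUhid * NDhid)]
        for _ in range(NUout)]

inp = [random.random() for _ in range(NFin * NUin)]

# Slide the same input->hidden matrix over every window of NDin input frames.
hidden = []
for i in range(NFhid):
    window = inp[i * NUin : (i + NDin) * NUin]
    for row in W_ih:
        hidden.append(logistic(sum(w * x for w, x in zip(row, window))))

# Likewise slide the shared hidden->output matrix over NDhid hidden frames.
output = []
for i in range(NFout):
    window = hidden[i * NUhid : (i + NDhid) * NUhid]
    for row in W_ho:
        output.append(logistic(sum(w * h for w, h in zip(row, window))))

print(len(hidden), len(output))  # 260 27
```

During training the same sharing forces the per-window weight gradients to be summed and averaged, which is what the divisions by (NDhid*NFout) and NFout in `procedure learn` are intended to do; that averaging is the part we are least sure we have right.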

Thanks in advance for your help.
-Ben Bryant
<bdbryan at eng.clemson.edu>
