Introduction to NETS

What NETS 3.0 is.

NETS 3.0 is a software simulator for a neural network (NN). A NN consists of layers of nodes connected by arcs. There is an input layer and an output layer. In between there are one (usually) or more 'hidden' layers. Signals pass along the connections from the input layer to the output layer. The size of the signal along each connection is controlled by a 'weight'. You train a network to associate output patterns with input patterns through a supervised learning process called backpropagation. During training the weights, which start off random, are gradually adjusted so that eventually the network produces the correct output for each input.
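
To make the role of the weights a little more concrete, here is a minimal C sketch of how a single node might combine its incoming signals. It is purely illustrative and is not taken from the Nets 3.0 sources; during training, backpropagation repeatedly adjusts the weights (and a bias term) so that the network's final outputs move towards the desired values.

#include <math.h>

/* Illustrative only -- not NETS 3.0 code.
   Output of one node fed by n incoming connections: each signal is
   scaled by its weight, and the weighted sum (plus a bias) is then
   squashed into the range 0..1 by a sigmoid function. */
double node_output(const double inputs[], const double weights[],
                   double bias, int n)
{
    double sum = bias;
    for (int i = 0; i < n; i++)
        sum += inputs[i] * weights[i];   /* signal scaled by its weight */
    return 1.0 / (1.0 + exp(-sum));      /* sigmoid squashing function */
}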

NETS 3.0 was developed by NASA.

Files used with Nets 3.0

The integer version of Nets 3.0 is nets_int.exe; the float version is nets_flt.exe. You will probably want to use the latter unless training time becomes a problem. Nets 3.0 can generate C code representing a trained network, and to build stand-alone executable programs from it you must recompile Nets 3.0 together with your own program (written in C). You therefore must also have all the *.c and *.h files for Nets 3.0.

The neural network itself is defined in a configuration file with the ending .net (see below). You must also create a training file of input/output pairs representing the patterns to be learned and the outputs corresponding to those patterns. This file normally has the ending .iop. A test file is also normally needed; it is like the iop file but without the output patterns.

NETS 3.0 also saves a file containing the weights, presumably those representing the trained network. This file comes in two flavours: an ASCII file with the ending .pwt, or a binary file, which loads more quickly but is not portable between platforms.

 

Developing a NETS 3.0 system -- an overview.

Neural networks are pattern recognizers: a neural network correlates input patterns with output patterns. Before it can do this, the network must be trained, so the programmer must provide a set of inputs and their corresponding outputs. These are put in a file, called the iop file, according to a special format (see below).

The programmer must also configure the network. The network's description is presented to Nets 3.0 using a small special-purpose language (see the *.net examples and more information below).

Before training the programmer must also set the allowable error margin.

There are quite a few factors affecting training. A number of these are under the control of the programmer. However, the system is 'smart' enough to provide reasonable default values for most of these.

The programmer initiates training and, after the system achieves the required error margin, saves the weights corresponding to the trained network. Training can take a long time and is not guaranteed to succeed.

NETS 3.0 has a C code generator which allows the trained network to be used in any C program.

Creating a fully connected network.

A network is fully connected when every node in one layer is connected to every node in the next layer. The provided xor.net file illustrates this kind of network. Below is the configuration file, and after it a sketch of the corresponding connections between the nodes.

LAYER : 0
NODES : 2
TARGET : 2
LAYER : 1
NODES : 1
LAYER : 2
NODES : 10
TARGET : 1
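
As the example suggests, TARGET names the layer that a layer's nodes feed into: layer 0 (the two input nodes) targets layer 2 (the hidden layer of 10 nodes), and layer 2 targets layer 1 (the single output node). Because the network is fully connected, the connections can be sketched roughly like this (based on the node counts above):

layer 0 (input, 2 nodes)  --->  layer 2 (hidden, 10 nodes)  --->  layer 1 (output, 1 node)

Each input node connects to all 10 hidden nodes (2 x 10 = 20 connections), and each hidden node connects to the lone output node (10 more), giving 30 weighted connections in all.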

The basic commands

As you can see from the example, you can describe a network with three basic commands: LAYER, NODES, and TARGET.

Note:

There must be a space on each side of the colon (:).

Use upper case.

The input layer is always layer 0 and the output layer is always layer 1, and these must be defined first. The 'hidden' layers are thus numbered 2, 3, 4, and so on.
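
For example, a hypothetical network with two hidden layers, written in the same style as xor.net above, would be laid out like this, with hidden layer 2 feeding hidden layer 3, which in turn feeds the output layer:

LAYER : 0
NODES : 4
TARGET : 2
LAYER : 1
NODES : 1
LAYER : 2
NODES : 6
TARGET : 3
LAYER : 3
NODES : 3
TARGET : 1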

Comments

There can be comments in the network configuration file. These begin with two dashes (--).
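
For example, the top of a configuration file might look like this:

-- xor.net : a fully connected network for the xor problem
-- (this line and the one above are ignored by Nets 3.0)
LAYER : 0
NODES : 2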

Learning rate and momentum

Good values for the learning rate and momentum can substantially improve the chances of the network being trained successfully. You can set the global values for these parameters from within the configuration file or using commands in the interpreter. (It is also possible to have different values for different layers.) In a configuration file you set the global values of these parameters by putting commands such as the following ahead of the other commands.

GLOBAL-LEARN-RATE : 2.5

GLOBAL-MOMENTUM : 0.9

(Make sure you have spaces around the colons!)

Training the network.

After you have created a network configuration file (.net), you next must create an input-output pairs file (.iop) from the training set of examples you have available.

The iop file.

The iop or training file consists of lists, in LISP format, of floating-point numbers, usually in the range 0.1 to 0.9 (nominally, 0 to 1). For the xor.net example, the xor.iop file might look like this:

(.9 .9 .1)

(.9 .1 .9)

(.1 .9 .9)

(.1 .1 .1)

Note that this resembles a truth table for xor, with 0s (F) and 1s (T) represented by floating-point numbers. (For reasons of training efficiency it is best to stay away from 1.0 and 0.0, although you can often use 1 and 0 when entering test examples.)

In the xor example, each row has 3 values in it. The first two are fed to the input nodes, and the last one represents the correct output expected from the lone output node. All iop entries for Nets 3.0 have this format. If you have a network with 1024 input nodes and 7 output nodes, then each list will have 1031 values, 1024 for the inputs, followed by 7 for the outputs.
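
For anything much bigger than xor you will not want to type the iop file by hand. Here is a small, purely illustrative C program (the helper and file name are made up for this example and are not part of Nets 3.0) that writes the four xor training pairs in the format shown above:

#include <stdio.h>

#define N_IN  2    /* number of input nodes  (xor example) */
#define N_OUT 1    /* number of output nodes (xor example) */

/* Illustrative helper: append one LISP-style list of N_IN input values
   followed by N_OUT output values to the iop file. */
static void write_iop_pair(FILE *fp, const double in[], const double out[])
{
    int i;
    fputc('(', fp);
    for (i = 0; i < N_IN; i++)
        fprintf(fp, "%.1f ", in[i]);
    for (i = 0; i < N_OUT; i++)
        fprintf(fp, (i < N_OUT - 1) ? "%.1f " : "%.1f", out[i]);
    fprintf(fp, ")\n");
}

int main(void)
{
    /* The four xor training pairs, using .1/.9 in place of 0/1. */
    double in[4][N_IN]   = { {.9,.9}, {.9,.1}, {.1,.9}, {.1,.1} };
    double out[4][N_OUT] = { {.1}, {.9}, {.9}, {.1} };
    FILE *fp = fopen("xor.iop", "w");
    int i;

    if (fp == NULL)
        return 1;
    for (i = 0; i < 4; i++)
        write_iop_pair(fp, in[i], out[i]);
    fclose(fp);
    return 0;
}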

Training.

Having created the net configuration file, say xor.net, and the iop file, say xor.iop, you are at last ready to go. Start Nets 3.0 by typing nets_flt at the prompt. (The program also runs nicely in a DOS window under MS Windows.) You will see a menu of available commands. There are quite a few:

NASA-JSC Software Technology Branch
NETS Back Propagation Simulator Version 3.0
b -- show bias values
c -- create a net
d -- set dribble parameters
e -- display error stats for a file
g -- generate delivery code
i -- setup I/O pairs
j -- reset I/O pairs
l -- change learning rate
m -- print this menu
n -- show net configuration
o -- reorder I/O pairs for training
p -- propagate an input through the net
r -- reset weights from a file
s -- save weights to a file
t -- teach the net
u -- scale I/O pairs
v -- scale a data file
w -- show weights between two layers
q -- quit program
NETS Choice(m = menu)?

You type 'c' to load the configuration file. You will be asked some questions, the number of which depends on what is in your configuration file. At first you should simply accept the proposed defaults.

Next type 'i' to load in the xor.iop file. You don't need to scale if the values are already between 0.0 and 1.0.

Then type 't' for training. When asked for the acceptable constraint error, a common reply is .2. Accept the 10000 value for the run, and then choose, say, 20 for the frequency of the output. Training starts, and stops when either the maximum error falls below .2 or 10000 cycles have occurred.

When the net is trained, save the weights with the 's' command. Choose the portable format to save the weights in.

Using the network.

You can use the network from the interpreter itself, or you can export C code which you can use to build stand-alone applications.

From the interpreter

In this case there are normally two situations. In the first, you have trained the network and you want to test it. If the number of nodes in the input layer is small, you can enter test values from the keyboard. You first type 'p' for propagate. Hit enter when asked for a file name. You will probably want the output to go to the screen for these tests, so hit enter again. Now enter values for each input node. For xor you might enter 1 and 1. The output might be 0.18, which would be considered OK because it is close to 0 (and well away from 0.5, which more or less represents a 'don't know' outcome).

The second situation involves doing some testing after you have previously exited Nets 3.0. In that case you load the network configuration file and then you must load the saved weights file. You do this with the reset command, 'r'. Then you can propagate test examples as before.

With a large number of input nodes, you will want to load test examples from a file. Create a test file using a word processor or, if necessary, a specially written program. The format of the test file's contents is the same as for the iop file, except that the values for the output nodes are left out. Then use the propagate command and enter the file name of your test file when prompted.
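
For the xor network such a test file is trivial; it might simply contain the four possible input patterns, one list per line and with no output values:

(1 1)
(1 0)
(0 1)
(0 0)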

In stand-alone applications.

After a network is trained and the corresponding weights saved, it is possible to invoke the Nets 3.0 code generator using the 'g' command. This command generates C code. For example, with xor.net you would get a file xor.c with all appropriate declarations and functions for accessing the neural network, which is stored as an array. The code is well commented. Try it out with the xor example.

A simple main function is provided which the user can modify and add other code to. When all this is done, the user code must be compiled together with all the Nets 3.0 code except the original Nets 3.0 main code, which would clash with the user's main function. A few changes must also be made in the header file common.h: in particular, DELIVERY must be set to 1, and the integer mode flag should be set to 0.
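
As a sketch of what the user code might end up looking like, here is a hypothetical main for the xor example. The actual function and variable names are whatever appears in the generated xor.c and in the provided main; the two names used below (load_xor_weights and propagate_xor) are placeholders invented purely for this illustration.

#include <stdio.h>

/* Placeholders only -- substitute the declarations actually found in the
   generated xor.c; these names are NOT the real generated interface. */
extern void load_xor_weights(const char *weight_file);
extern void propagate_xor(const float *inputs, float *outputs);

int main(void)
{
    float in[2] = { 1.0f, 1.0f };   /* one xor test pattern */
    float out[1];

    load_xor_weights("xor.pwt");    /* the weights saved earlier with 's' */
    propagate_xor(in, out);
    printf("xor(1,1) -> %f\n", out[0]);
    return 0;
}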

Partially connected networks.

The default mode of operation of Nets 3.0 is to have the network fully connected. This arrangement can be inappropriate for large networks because of the enormous number of interconnections required. Furthermore, in modeling some situations, such as human vision, a partially connected network seems more intuitive, more like the human retina. An example of a partially connected network may be found in the file alpha.net, which can be used to recognize letters. (See also the files alpha, alpha.exe and alpha.iop.) Here is alpha.net.

GLOBAL-LEARN-RATE : 0.3
GLOBAL-MOMENTUM : 0.9
LAYER : 0
NODES : 120
X-DIMENSION : 10
Y-DIMENSION : 12
TARGET : 2
PATTERN-X-DIMENSION : 4
PATTERN-Y-DIMENSION : 3
X-OVERLAP : 1
LAYER : 1
NODES : 12
LAYER : 2
NODES : 12
X-DIMENSION : 3
Y-DIMENSION : 4
TARGET : 1

Note that PATTERN-X-DIMENSION, PATTERN-Y-DIMENSION, X-OVERLAP, and Y-OVERLAP must be chosen to be consistent with the X-DIMENSION and Y-DIMENSION of the corresponding layers. Fortunately, NETS 3.0 will usually warn you if you get it wrong.
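
As a rough check of the alpha.net numbers (assuming, as the figures suggest, that each pattern of input nodes feeds one node of the target layer): the input layer is 10 nodes wide and 12 high. With a pattern 4 wide and an X-OVERLAP of 1, successive patterns start 4 - 1 = 3 nodes apart, so (10 - 4) / 3 + 1 = 3 patterns fit across. With a pattern 3 high and no Y-OVERLAP, 12 / 3 = 4 patterns fit down. That gives 3 x 4 = 12 patterns, which matches the NODES : 12, X-DIMENSION : 3 and Y-DIMENSION : 4 declared for layer 2.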