To compute the new activation values of the units, the SNNS simulator
running on a sequential workstation processor has to visit all of them
in some sequential order. This order is defined by the
Update Mode. Five update modes for general use are implemented in
SNNS. The first is a synchronous mode; all others are asynchronous,
i.e. in these modes units see the new outputs of their predecessors
if those predecessors have fired before them.
- synchronous: All units change their activations together at the
end of each step. To do this, the kernel first computes the new
activations of all units from their activation functions in some
arbitrary order. Only after all units have been assigned their new
activation values are the new outputs of the units computed. An outside
observer gets the impression that all units have fired
simultaneously (in sync).
- random permutation: The units apply their activation and output
functions sequentially. The order is chosen at random, but each unit
is selected exactly once in every step.
- random: The order is defined by a random number
generator. Thus it is not guaranteed that all units are visited
exactly once in one update step, i.e. some units may be updated
several times, some not at all.
- serial: The order is defined by ascending internal unit
number. If units are created with ascending unit numbers from input to
output units, this is the fastest mode. Note that serial mode is not
advisable if the unit numbers of a network do not ascend from input to
output.
- topological: The kernel sorts the units by their
topology. This order corresponds to the natural propagation of
activity from input to output. In pure feed-forward nets the input
activation reaches the output especially fast with this mode, because
each unit is updated only after all of its predecessors, so its output
is final once computed and never changes later in the step.
Additionally, there are 12 more update modes for special network
topologies implemented in SNNS.
- CPN: For learning with counterpropagation.
- Time Delay: This mode takes into account the special
connections of time delay networks. Connections have to be updated in
the order in which they become valid in the course of time.
- ART1_Stable, ART2_Stable and ARTMAP_Stable:
Three update modes for the three adaptive resonance theory network
models. They propagate a pattern through the network until a stable
state has been reached.
- ART1_Synchronous, ART2_Synchronous and
ARTMAP_Synchronous: Three other update modes for the three adaptive
resonance theory network models. They perform just one propagation
step with each call.
- CC and RCC: Special update modes for the cascade
correlation and recurrent cascade correlation meta algorithms.
- BPTT: For recurrent networks, trained with
"backpropagation through time".
- RM_Synchronous: Special update mode for auto-associative
memory networks.
Note that all update modes apply only to the forward
propagation phase; the backward phase in learning procedures like
backpropagation is not affected at all.
Niels.Mache@informatik.uni-stuttgart.de
Tue Nov 28 10:30:44 MET 1995