
Backpropagation with Weight Decay

Weight decay was introduced by P. Werbos ([Wer88]). It shrinks the weights of the links while they are trained with backpropagation: in addition to each backpropagation update of a weight, the weight is decreased by a fraction d of its old value. The resulting update formula is

    Δw_ij(t+1) = η δ_j o_i − d w_ij(t)

where η is the learning rate, δ_j the error signal of the target unit, o_i the output of the source unit, and d the decay parameter.
The effect is similar to that of the pruning algorithms (see the chapter on pruning): weights are driven towards zero unless reinforced by backpropagation. For further information see [Sch94].
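The combined update can be sketched as follows. This is a minimal illustration for a single linear unit with squared error, not the SNNS implementation; the function name and the values of eta and d are chosen for the example only.

```python
import numpy as np

def train_step(w, x, target, eta=0.2, d=0.001):
    """One backpropagation update followed by weight decay.

    Illustrative sketch: single linear unit, E = 0.5 * (target - out)**2.
    eta is the learning rate, d the decay fraction applied to the old weight.
    """
    out = w @ x                       # forward pass
    err = target - out                # output error
    grad = -err * x                   # dE/dw
    w_new = w - eta * grad - d * w    # backprop step, then subtract d * w(t)
    return w_new

w = np.array([0.5, -0.3])
x = np.array([1.0, 2.0])
w = train_step(w, x, target=1.0)
```

With a zero gradient (e.g. zero input), each step simply multiplies the weights by (1 − d), which is why unreinforced weights decay towards zero.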
Tue Nov 28 10:30:44 MET 1995