Hebbian Rule
A class of learning rules which contain, in some part, the minimal Hebbian rule.
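In its most common form (the notation here is assumed rather than taken from this note: \( \eta \) is a learning rate and \( x_i \), \( x_j \) are the activities of the two connected neurons), the minimal Hebbian rule can be written as

\[ \Delta w_{ij} = \eta \, x_i x_j \]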
Instability
The need for multiple Hebbian rules arises from the definition of the minimal Hebbian rule. The rule has no negative component, so \( \Delta{w_{ij}} \) is always positive, and the weight between any two neurons that fire together can only increase. Because larger weights produce larger post-synaptic activity, which in turn produces larger updates, the weights quickly grow exponentially. The minimal Hebbian rule is [see page 27, unstable].
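A minimal sketch of this runaway growth, assuming a single linear neuron with non-negative input rates, a fixed learning rate, and NumPy (none of which come from this note):

```python
import numpy as np

rng = np.random.default_rng(0)

eta = 0.1                                         # learning rate
inputs = rng.uniform(0.0, 1.0, size=(500, 4))     # non-negative pre-synaptic firing rates
w = rng.uniform(0.0, 0.1, size=4)                 # small non-negative initial weights

norms = []
for x in inputs:
    y = w @ x                  # post-synaptic activity of a linear neuron
    w += eta * y * x           # minimal Hebbian update: never negative here
    norms.append(np.linalg.norm(w))

# The weight norm only ever increases, and the growth compounds:
print(norms[0], norms[249], norms[499])
```

Since every update is non-negative and scales with the current output \( y \), which itself scales with the weights, the norm keeps feeding its own growth; nothing in the rule pulls the weights back down.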