BCM Rule
A Hebbian rule which [see page 3, uses] a non-monotonic transformation function \( \Phi \) and a sliding threshold \( \theta \) (any post-synaptic output below \( \theta \) produces a negative \( \Delta w_{ij} \)) to induce homeostasis. \[ \Delta w_{ij} = \alpha \Phi(v_i^{\text{post}} - \theta) v_j^{\text{pre}} \] where \( \theta = f(\langle v_i^{\text{post}} \rangle) \) varies with the average post-synaptic firing rate: if the weights grow too large, the average output rises and the threshold increases, curbing further potentiation, and vice versa.
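As a rough sketch (not from the notes), the update can be implemented with one common concrete choice of \( \Phi \), namely \( \Phi = v^{\text{post}}(v^{\text{post}} - \theta) \), with \( \theta \) tracking a running average of the squared post-synaptic rate. Function and parameter names here are illustrative:

```python
import numpy as np

def bcm_update(w, v_pre, theta, alpha=0.01, tau_theta=100.0):
    # Post-synaptic rate of a single linear neuron: v_post = w . v_pre
    v_post = w @ v_pre
    # Non-monotonic Phi: negative for 0 < v_post < theta, positive above it
    phi = v_post * (v_post - theta)
    # Hebbian-style weight change, gated by Phi
    w = w + alpha * phi * v_pre
    # Sliding threshold: running average of the squared post-synaptic rate
    theta = theta + (v_post**2 - theta) / tau_theta
    return w, v_post, theta
```

With `w = [0.5, 0.5]`, `v_pre = [1, 0]` and `theta = 0.1`, the output `v_post = 0.5` exceeds the threshold, so the active synapse is potentiated; had `v_post` fallen below `theta`, the same input would instead have depressed it.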
Selectivity and Competition
Consider applying BCM to an [see page 7, unselective] neural network. Each output neuron comes to detect different correlations across the input neurons, acclimating to a particular cluster of correlated inputs.
BCM [see page 13, introduces] competition, causing output neurons to specialise in different correlated input patterns, similar to the complex receptive fields in the visual cortex.
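A minimal way to see this selectivity emerge (an illustrative simulation, not from the notes; the learning rate `alpha` and threshold time constant `tau_theta` are assumed): present two orthogonal input patterns to a single BCM neuron with slightly asymmetric initial weights. The sliding threshold makes the stronger response grow while the weaker one is pushed below threshold and depressed toward zero.

```python
import numpy as np

# Two orthogonal input patterns (two "clusters" of correlation)
patterns = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

w = np.array([0.6, 0.4])   # slight asymmetry breaks the tie
theta = 0.0                # sliding threshold, updated faster than w
alpha, tau_theta = 0.02, 10.0

for step in range(5000):
    x = patterns[step % 2]                  # alternate the two patterns
    v = w @ x                               # post-synaptic response
    w = w + alpha * v * (v - theta) * x     # BCM weight update
    theta = theta + (v**2 - theta) / tau_theta

# The neuron becomes selective: the response to one pattern is driven up
# while the response to the other decays toward zero.
responses = [w @ p for p in patterns]
```

The initially larger response stays above threshold and is potentiated, while the other falls below it and is depressed, so the neuron specialises in one input pattern, which is the competition described above.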