The Role of Constraints in Hebbian Learning
Miller, K.D. & MacKay, D.J.C. (1994). The role of constraints in Hebbian learning. Neural Computation 6: 100-126
Outline
Constraints on Hebbian Plasticity:
• Importance
• Types of Constraints
• Dynamic Effects of Constraints
• Biological evidence for Constraints
• Function of Constraints in Heterogeneous Networks
Hebb’s Rule
τ_w dw/dt = ⟨v u⟩ = C·w (correlation-based learning rule)
where v = w·u is the output and C = ⟨u uᵀ⟩ is the input correlation matrix
Problem with Hebb’s Rule:
• weights grow without bound ⇒ instability
• loss of selectivity to different patterns of input
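A minimal sketch of the instability (not from the paper; the correlation matrix, time step, and initial weights below are invented for illustration):

```python
import numpy as np

tau_w, dt = 1.0, 0.01
C = np.array([[1.0, 0.5],
              [0.5, 1.0]])          # input correlation matrix <u u^T> (assumed values)
w = np.array([0.1, 0.05])           # initial synaptic weights

for _ in range(2000):
    w = w + (dt / tau_w) * (C @ w)  # Euler step of tau_w dw/dt = C w

print(w)  # weights explode exponentially at the rate of C's largest eigenvalue
```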
Solution: Introduce Competition
(a constraint that limits the total synaptic strength over the cell)
• Multiplicative: each synapse decays at a rate proportional to its current strength:
τ_w dw/dt = C·w − γ(w) w
• Subtractive: each synapse decays at the same fixed rate:
τ_w dw/dt = C·w − ε(w) n
where n = (1, 1, …, 1)ᵀ in the synaptic basis
Types of Constraints the two methods can enforce:
• Type 1 – Conserve total synaptic strength Σᵢ wᵢ = n·w
Hence, M1: γ(w) = n·Cw / n·w and S1: ε(w) = n·Cw / n·n
⇒ Hyperplane Constraint Surface
• Type 2 – Conserve sum-squared synaptic strength Σᵢ wᵢ² = w·w
Hence, M2: γ(w) = w·Cw / w·w
⇒ Hypersphere Constraint Surface
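The three constrained dynamics fit into one update step. A hedged sketch: the rule names M1, S1, M2 follow the slides, while the Euler step and example values are assumptions:

```python
import numpy as np

def step(w, C, rule, dt=0.01):
    """One Euler step of Hebbian growth under the named constraint."""
    n = np.ones_like(w)                 # constraint vector n = (1, ..., 1)^T
    growth = C @ w                      # unconstrained Hebbian term
    if rule == "M1":                    # multiplicative, conserves n.w
        dw = growth - (n @ growth / (n @ w)) * w
    elif rule == "S1":                  # subtractive, conserves n.w
        dw = growth - (n @ growth / (n @ n)) * n
    elif rule == "M2":                  # multiplicative, conserves w.w
        dw = growth - (w @ growth / (w @ w)) * w
    return w + dt * dw

C = np.array([[1.0, 0.5],
              [0.5, 1.0]])
w = np.array([0.6, 0.4])
w = step(w, C, "S1")
print(w.sum())   # 1.0: total synaptic strength is preserved exactly under S1
```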
Dynamic Effects of Multiplicative and Subtractive Constraints
[Figure: weight dynamics on the constraint surface under each rule]
• Positive correlations in Hebb’s rule: the principal eigenvector e0 of C is close in direction to the constraint vector n.
• Correlations oscillating in sign: e0 lies parallel to the constraint surface (e0·n = 0), i.e. e0 is a zero-sum vector.
• Geometrically, the multiplicative constraint removes the component of the growth Cw along w, while the subtractive constraint removes the component along n, so the two rules push w to different points on the same constraint surface.
Theorem 1: Under a multiplicatively enforced constraint, if the principal eigenvector of C is an interior fixed point, it is stable; interior fixed points that are non-principal eigenvectors are unstable (saturation is unnecessary).
Theorem 2: Under an S1 constraint, if C has at least two eigenvectors with positive eigenvalues, then any interior fixed point is unstable.
Theorem 3: If i and j are indices in the synaptic basis and Cii > |Cij|, then under an S1 constraint all synapses (or all but one) are saturated in any stable final condition.
⇒ S1 constraints let a zero-sum weight pattern grow until the synapses reach complete saturation.
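The theorems can be illustrated numerically. In this sketch the five-synapse correlation matrix, bounds, and step counts are invented; it shows M1 settling on graded weights and S1 driving saturation:

```python
import numpy as np

rng = np.random.default_rng(0)
N, w_max, dt = 5, 1.0, 0.01
idx = np.arange(N)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]))   # C_ii = 1 > |C_ij| for i != j

def run(rule, steps=20000):
    n = np.ones(N)
    w = 0.5 + 0.01 * rng.standard_normal(N)
    for _ in range(steps):
        growth = C @ w
        if rule == "M1":
            dw = growth - (n @ growth / (n @ w)) * w
        else:                                      # S1
            dw = growth - (n @ growth / N) * n
        w = np.clip(w + dt * dw, 0.0, w_max)       # hard saturation bounds
    return w

print("M1:", run("M1").round(2))   # graded weights along C's principal eigenvector
print("S1:", run("S1").round(2))   # all synapses (or all but one) at 0 or w_max
```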
Outcomes of development with and without Constraints
• Multiplicative (M1): RF has graded strengths
• Subtractive (S1): RFs sharpened; ocular dominance develops (final number of non-zero synapses ∝ w_tot/w_max)
Let w1 and w2 be the synaptic weight vectors from the two input projections, and define
ws = w1 + w2, wd = w1 − w2
As the inputs are symmetric, the eigenvectors of C divide into sum eigenvectors (ws = e^S_a, wd = 0, eigenvalue λ^S_a) and difference eigenvectors (wd = e^D_a, ws = 0, eigenvalue λ^D_a).
Patterns of wd have zero total synaptic strength ⇒ wd grows freely under S1, but is suppressed under M1 (unless e^D_a is the principal eigenvector, i.e. λ^D_a > λ^S_a – only possible with −ve correlations between the inputs from the two eyes!)
⇒ S1 constraints are more appropriate than M1 constraints for modeling ocular dominance
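A small sketch of the sum/difference decomposition (the within-eye correlations and the between-eye factor c are invented for illustration): wd = (e, −e) always has zero total strength, and its eigenvalue exceeds the sum mode's only when c < 0:

```python
import numpy as np

N, c = 4, 0.3                          # c: between-eye correlation factor (assumed)
idx = np.arange(N)
Cw = np.exp(-np.abs(idx[:, None] - idx[None, :]))  # within-eye correlations
C = np.block([[Cw, c * Cw],
              [c * Cw, Cw]])           # full two-eye correlation matrix

lam, vec = np.linalg.eigh(Cw)          # eigenvalues in ascending order
e = vec[:, -1]                         # principal within-eye eigenvector
ws = np.concatenate([e, e])            # sum mode w_S, eigenvalue (1 + c) * lam
wd = np.concatenate([e, -e])           # difference mode w_D, eigenvalue (1 - c) * lam
print(np.allclose(C @ ws, (1 + c) * lam[-1] * ws))   # True
print(np.allclose(C @ wd, (1 - c) * lam[-1] * wd))   # True

# w_D has zero total synaptic strength, so S1 (which only removes growth along n)
# never opposes it; M1 suppresses it unless (1 - c) > (1 + c), i.e. c < 0.
print(np.ones(2 * N) @ wd)             # 0.0
```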
Constraints applied to a full layer of output cells – Heuristic Approach
Suggested Biological Implementation of Subtractive Constraints:
• Limited capacity of metabolic supply to synapses (constant decay imposed on each synapse)
• decay rate dependent on average degree of activation of cell
However, S1 punishes weak synapses more than M1: a fixed decay removes a larger fraction of a small weight than of a large one.
Evidence for another type of Constraint : Multiplicative Homeostatic Scaling in Cultured Networks
Turrigiano, G.G. & Nelson, S.B. (2004). Nature Rev. Neurosci. 5: 97-107
Turrigiano, G.G. et al. (1998). Nature 391: 892-896
[Figure: cumulative mEPSC amplitude distributions – solid: multiplicative scaling; dotted: random additive scaling; dashed: additive scaling]
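A hedged illustration of the distinction the figure draws (the amplitude distribution is invented, not data from Turrigiano et al.): multiplicative scaling preserves relative synaptic strengths, additive scaling does not:

```python
import numpy as np

rng = np.random.default_rng(1)
amps = rng.lognormal(mean=2.0, sigma=0.5, size=1000)  # invented mEPSC amplitudes (pA)

scaled_mult = 1.5 * amps      # multiplicative: same factor for every synapse
scaled_add = amps + 10.0      # additive: same increment for every synapse

# Only multiplicative scaling preserves each synapse's strength relative to the mean:
print(np.allclose(scaled_mult / scaled_mult.mean(), amps / amps.mean()))  # True
print(np.allclose(scaled_add / scaled_add.mean(), amps / amps.mean()))    # False
```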
Physiology of Homeostatic Scaling
Mechanisms of Homeostatic Scaling?
• Single-cell level
• Network level
Homeostatic Scaling applied to Spatial Working Memory
Renart, A., Song, P. & Wang, X.-J. (2003). Neuron 38: 473-485
[Figure: network activity with homogeneous vs. heterogeneous cell properties]
I_syn = g_s · s · g_syn (V − V_syn), where g_s is the cell’s homeostatic scaling factor, s the synaptic gating variable, g_syn the maximal synaptic conductance, and V_syn the reversal potential
Homeostatic Scaling allows robust Spatial Memory Encoding in Heterogeneous Networks
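A minimal sketch of the homeostatic loop, assuming a toy rate model with invented parameters (not the conductance-based network of Renart et al.): each cell slowly scales its synaptic gain g_s toward a target rate, compensating for heterogeneous excitability:

```python
import numpy as np

rng = np.random.default_rng(2)
N, r_target, eta = 50, 10.0, 0.01
excitability = 1.0 + 0.2 * rng.standard_normal(N)    # heterogeneous cell properties
g_s = np.ones(N)                                     # per-cell synaptic scaling factors
drive = 10.0                                         # common synaptic drive (arbitrary)

for _ in range(5000):
    rate = excitability * g_s * drive                # toy firing-rate readout
    g_s += eta * g_s * (r_target - rate) / r_target  # slow multiplicative scaling

print(rate.mean().round(2), rate.std().round(4))     # rates at target, spread ~ 0
```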
Summary
• Constraints are required to clamp uncontrolled growth of Hebbian plasticity and to maintain input selectivity
• Types of Constraints – S1, M1, M2
• Biological evidence for multiplicative scaling constraint
• Multiplicative homeostatic scaling allows robust encoding in heterogeneous networks