The Role of Constraints in Hebbian Learning (Miller and MacKay 1994, Neural Comput: 101-126)


Page 1

The Role of Constraints in Hebbian Learning

Miller and MacKay 1994, Neural Comput: 101-126

Page 2

Outline

Constraints on Hebbian Plasticity:

• Importance

• Types of Constraints

• Dynamic Effects of Constraints

• Biological evidence for Constraints

• Function of Constraints in Heterogeneous Networks

Page 3

Hebb’s Rule

τw dw/dt = v u (correlation-based learning rule)

With a linear output v = w · u, averaging over input patterns gives τw dw/dt = C·w, where C = ⟨u uᵀ⟩ is the input correlation matrix.

Problems with Hebb's Rule (illustrated in the sketch below):

• weights grow without bound => instability

• loss of selectivity to different patterns of input
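As a quick numerical illustration (not from the paper): iterating the unconstrained rule makes the weight norm diverge along the principal eigenvector of C. The correlation matrix and initial weights below are assumed toy values.

```python
import numpy as np

# Assumed toy input correlation matrix with positive correlations
C = np.array([[1.0, 0.6],
              [0.6, 1.0]])

w = np.array([0.3, 0.1])      # small initial weights
dt, tau = 0.01, 1.0

# Unconstrained Hebbian dynamics: tau dw/dt = C w (Euler integration)
for _ in range(2000):
    w += (dt / tau) * (C @ w)

print("final |w|:", np.linalg.norm(w))            # grows without bound
print("final direction:", w / np.linalg.norm(w))  # aligns with principal eigenvector (1,1)/sqrt(2)
```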

Page 4

Solution: Introduce Competition

(a constraint that limits the total synaptic strength over the cell)

• Multiplicative: each synapse decays at a rate proportional to its current strength: τw dw/dt = C·w − γ(w)w

• Subtractive: each synapse decays at a fixed rate: τw dw/dt = C·w − ε(w)n

where n = (1, 1, …, 1)ᵀ in the synaptic basis

Page 5

Types of Constraints the two methods can enforce:

• Type 1 – Conserve total synaptic strength, Σi wi = n · w

Hence, M1: γ(w) = n · Cw / n · w and S1: ε(w) = n · Cw / n · n

Hyperplane Constraint Surface (n · w = constant)
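A minimal sketch of the Type 1 dynamics under both enforcement schemes, using the γ(w) and ε(w) above; the correlation matrix, initial weights, and saturation bounds are assumptions for illustration. Both flows conserve n · w, but M1 relaxes to a graded interior fixed point while S1 pushes weights toward the bounds.

```python
import numpy as np

C = np.array([[1.0, 0.4, 0.2],
              [0.4, 1.0, 0.4],
              [0.2, 0.4, 1.0]])          # assumed toy correlation matrix
n = np.ones(3)                            # constraint vector (1,1,...,1)^T
dt, steps = 0.01, 20000

def run(rule):
    w = np.array([0.5, 0.3, 0.2])         # n·w = 1 initially
    for _ in range(steps):
        Cw = C @ w
        if rule == "M1":                   # gamma(w) = n·Cw / n·w
            dw = Cw - (n @ Cw) / (n @ w) * w
        else:                              # S1: eps(w) = n·Cw / n·n
            dw = Cw - (n @ Cw) / (n @ n) * n
        w = np.clip(w + dt * dw, 0.0, 1.0) # assumed saturation limits [0, w_max = 1]
    return w

print("M1 final w:", run("M1"))  # graded: principal eigenvector rescaled so n·w = 1
print("S1 final w:", run("S1"))  # zero-sum component grows until synapses saturate
```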

Page 6

• Type 2 – Conserve sum-squared synaptic strength, Σi wi² = w · w

Hence, M2: γ(w) = w · Cw / w · w

Hypersphere Constraint Surface (w · w = constant)
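M2 is closely related to Oja's rule. A minimal sketch (assumed toy values) showing that the M2 flow conserves w · w and converges to the principal eigenvector on the hypersphere:

```python
import numpy as np

C = np.array([[1.0, 0.5],
              [0.5, 1.0]])            # assumed toy correlation matrix
w = np.array([0.8, 0.1])              # w·w = 0.65 initially
dt = 0.01

for _ in range(5000):
    Cw = C @ w
    gamma = (w @ Cw) / (w @ w)        # M2: gamma(w) = w·Cw / w·w
    w += dt * (Cw - gamma * w)        # this flow conserves w·w

print("w·w:", w @ w)                           # stays ~0.65 (hypersphere)
print("direction:", w / np.linalg.norm(w))     # -> principal eigenvector (1,1)/sqrt(2)
```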

Page 7

Dynamic Effects of Multiplicative and Subtractive Constraints

Fixed points: under multiplicative constraints Cw ∝ w (w is an eigenvector of C); under subtractive constraints Cw ∝ n.

• Positive correlations in Hebb's rule: the principal eigenvector e0 is close in direction to the constraint vector n.

• Correlations oscillating in sign: e0 is parallel to the constraint surface (e0 · n = 0), i.e. e0 is a zero-sum vector and always lies in the constraint surface.
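A quick numerical check (assumed toy C) of the two fixed-point conditions: a multiplicative fixed point satisfies Cw ∝ w (an eigenvector of C), while a subtractive interior fixed point satisfies Cw ∝ n.

```python
import numpy as np

C = np.array([[1.0, 0.4],
              [0.4, 1.0]])            # assumed toy correlation matrix
n = np.ones(2)

# Multiplicative fixed point: Cw = gamma*w, i.e. w is an eigenvector of C
evals, evecs = np.linalg.eigh(C)
w_m = evecs[:, -1]                    # principal eigenvector
print("M fixed point, Cw/w:", (C @ w_m) / w_m)   # constant ratio => Cw ∝ w

# Subtractive interior fixed point: Cw = eps*n  =>  w ∝ C^{-1} n
w_s = np.linalg.solve(C, n)
print("S fixed point, Cw:", C @ w_s)             # equals n => Cw ∝ n
```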

Page 8

Theorem 1: Under a multiplicatively enforced constraint, if the principal eigenvector of C is an interior fixed point, it is stable; interior fixed points that are nonprincipal eigenvectors are unstable. (Saturation is unnecessary.)

Theorem 2: Under an S1 constraint, if C has at least two eigenvectors with positive eigenvalues, then any interior fixed point is unstable (checked numerically in the sketch below).

Theorem 3: If i and j are indices in the synaptic basis and Cii > |Cij|, then under an S1 constraint either all synapses, or all but one, are saturated in a stable final condition. S1 constraints let a zero-sum vector grow until the weights reach complete saturation.
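A sketch of the Theorem 2 check (assumed C with two positive eigenvalues): perturb the S1 interior fixed point by a tiny zero-sum vector and integrate the S1 flow without saturation bounds; the perturbation grows exponentially.

```python
import numpy as np

C = np.array([[1.0, 0.3],
              [0.3, 1.0]])            # assumed: both eigenvalues (1.3, 0.7) positive
n = np.ones(2)

w_star = np.linalg.solve(C, n)        # interior fixed point: C w* = n
w = w_star + 1e-6 * np.array([1.0, -1.0])   # tiny zero-sum perturbation

dt = 0.01
for _ in range(3000):
    Cw = C @ w
    w += dt * (Cw - (n @ Cw) / (n @ n) * n)  # S1 flow, no saturation

print("deviation:", w - w_star)       # grew by ~e^(0.7 t): interior fixed point is unstable
```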

Page 9

Outcomes of development with and without Constraints

• RF has graded strengths (multiplicative constraints)

• RFs sharpened; ocular dominance develops (subtractive constraints; final number of nonzero synapses ∝ wtot/wmax)

Page 10

Let w1 and w2 be the synaptic weight vectors from each input projection:

ws = w1 + w2, wd = w1 − w2

As the inputs are symmetric, the eigenvectors divide into sum eigenvectors eSa (ws = eSa, wd = 0, eigenvalue λSa) and difference eigenvectors eDa (wd = eDa, ws = 0, eigenvalue λDa).

Patterns of wd have zero total synaptic strength => wd grows freely under S1 but is suppressed under M1 (unless eDa is the principal eigenvector, i.e. λDa > λSa, which is only possible with negative correlations between the inputs from the two eyes!).

S1 constraints are therefore more appropriate than M1 constraints for modeling ocular dominance.
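A minimal two-eye sketch of this claim (one synapse per eye; the within-eye and between-eye correlation values are assumptions): with a non-negative between-eye correlation, λDa < λSa, so wd decays under M1 but grows under S1 until one eye's synapse saturates.

```python
import numpy as np

c_between = 0.2                        # assumed non-negative between-eye correlation
C = np.array([[1.0, c_between],
              [c_between, 1.0]])       # acts on w = (w1, w2), one synapse per eye
n = np.ones(2)
dt, steps = 0.01, 4000

def run(rule):
    w = np.array([0.55, 0.45])         # slight initial bias toward eye 1
    for _ in range(steps):
        Cw = C @ w
        if rule == "M1":
            dw = Cw - (n @ Cw) / (n @ w) * w
        else:                           # S1
            dw = Cw - (n @ Cw) / (n @ n) * n
        w = np.clip(w + dt * dw, 0.0, 1.0)   # saturation at [0, 1]
    return w

for rule in ("M1", "S1"):
    w1, w2 = run(rule)
    print(f"{rule}: ws = {w1 + w2:.2f}, wd = {w1 - w2:.2f}")
# M1: wd -> 0 (binocular);  S1: wd -> 1 (monocular, ocular dominance)
```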

Page 11

Constraints applied to a full layer of output cells – Heuristic Approach

Page 12

Suggested Biological Implementation of Subtractive Constraints:

• Limited capacity of the metabolic supply to synapses (a constant decay imposed on each synapse)

• Decay rate dependent on the average degree of activation of the cell

However, S1 punishes weak synapses more than M1: a fixed decrement removes a larger fraction of a weak synapse's strength than of a strong one's.

Page 13

Evidence for another type of Constraint: Multiplicative Homeostatic Scaling in Cultured Networks

Turrigiano and Nelson, Nat Rev Neurosci 5: 97-107 (2004)

Page 14

Turrigiano et al, Nature 391: 892-896 (1998)

[Figure (Turrigiano et al. 1998): solid line = multiplicative scaling; dotted = random additive scaling; dashed = additive scaling]

Physiology of Homeostatic Scaling
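A sketch of the rank-order comparison the figure makes, on synthetic amplitudes (not the paper's recordings): amplitudes scaled multiplicatively are fit well by scaled ≈ a × control and poorly by scaled ≈ control + b. The lognormal distribution and the scaling factor 1.6 are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "control" mEPSC amplitudes (lognormal is an assumption)
control = np.sort(rng.lognormal(mean=2.0, sigma=0.5, size=500))
# Activity blockade modeled as multiplicative scaling by 1.6 plus noise (assumed)
scaled = np.sort(1.6 * control + rng.normal(0.0, 0.5, size=500))

a = (control @ scaled) / (control @ control)   # least-squares multiplicative factor
b = float(np.mean(scaled - control))           # least-squares additive offset

mse_mult = float(np.mean((scaled - a * control) ** 2))
mse_add = float(np.mean((scaled - (control + b)) ** 2))
print(f"multiplicative fit: a = {a:.2f}, MSE = {mse_mult:.2f}")
print(f"additive fit:       b = {b:.2f}, MSE = {mse_add:.2f}")   # markedly worse
```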

Page 15

Mechanisms of Homeostatic Scaling?

Single cell level

Network level

Page 16

Homeostatic Scaling applied to Spatial Working Memory

Renart et al, Neuron 38: 473-485 (2003)

[Figure: network activity with homogeneous vs. heterogeneous cell properties]

Page 17

Isyn = gs s gsyn (V − Vsyn), where gsyn is the synaptic conductance, s the synaptic gating variable, Vsyn the synaptic reversal potential, and gs a cell-specific homeostatic scaling factor

Homeostatic Scaling allows robust Spatial Memory Encoding in Heterogeneous Networks
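A toy sketch of the principle (not Renart et al.'s conductance-based model): each cell multiplicatively adjusts a scaling factor gs on its synaptic drive toward a target rate, which removes the rate heterogeneity introduced by variable intrinsic gains. All parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50
gain = rng.uniform(0.5, 1.5, size=N)   # assumed heterogeneous intrinsic excitability
gs = np.ones(N)                        # per-cell homeostatic scaling factors
drive, r_target, eta = 10.0, 5.0, 0.02

for _ in range(2000):
    r = gain * gs * drive              # toy linear rate model
    gs *= 1.0 + eta * (r_target - r) / r_target   # multiplicative scaling toward target

print(f"rate s.d. before scaling: {np.std(gain * drive):.2f}")
print(f"rate s.d. after scaling:  {np.std(gain * gs * drive):.4f}")  # ~0: rates equalized
```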

Page 18

Summary

• Constraints are required to prevent the unbounded weight growth produced by Hebbian plasticity and to maintain input selectivity

• Types of Constraints – S1, M1, M2

• Biological evidence supports a multiplicative scaling constraint (homeostatic synaptic scaling)

• Multiplicative homeostatic scaling allows robust encoding in heterogeneous networks