Null-spaces in neural encoding and plasticity

Matthias Hennig, Michael Rule, Martino Sorbaro, Michael Deistler
School of Informatics, University of Edinburgh

Synaptic connectivity constrains the state space accessible to networks
of neurons, and Hebbian learning plays an important role in shaping this
space in an experience- and behaviour-dependent manner. The role of a
single synapse, and its effect on the global dynamics, is however
difficult to assess. Here we use energy-based models of neural circuits
to address this question in two scenarios, the encoding of sensory
information and the storing of new memories in a recurrent network. In
both cases, analysis of the Fisher information matrix of the system
reveals an anisotropic synaptic parameter space, with stiff directions
where small changes lead to large functional changes, and sloppy
directions that correspond to null-spaces. We show that these null-spaces
can be exploited to suppress response variability in an encoding
scenario, and that they enable continual learning without catastrophic
forgetting. Based on these results, we suggest a new theory of neural
pruning during development, and derive a local Hebbian learning rule
that allows continual learning in a Hopfield attractor network.
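The stiff/sloppy distinction can be made concrete with a toy example (not the model used in the abstract): for a linear encoder with two redundant weights, the Fisher information matrix is singular, and moving the weights along its null direction leaves the encoded response unchanged. A minimal sketch, assuming Gaussian output noise with unit variance so that the Fisher information matrix reduces to the averaged outer product of output Jacobians:

```python
import numpy as np

# Toy encoder: scalar output y = (w1 + w2) * x with unit-variance Gaussian
# noise. The two weights are redundant, so the Fisher information matrix
# (FIM) over (w1, w2) is singular; its null space is the sloppy direction.
rng = np.random.default_rng(0)
x = rng.normal(size=200)           # input samples

# For Gaussian noise with unit variance, the FIM is the average of
# J^T J over inputs, where J = d(mean output)/d(weights) = [x, x] here.
J = np.stack([x, x], axis=1)       # (200, 2) Jacobian rows
F = J.T @ J / len(x)               # 2x2 Fisher information matrix

eigvals, eigvecs = np.linalg.eigh(F)
null_dir = eigvecs[:, 0]           # eigenvector of the smallest eigenvalue

# Perturbing the weights along the null direction leaves the output
# unchanged: the perturbation adds +c to one weight and -c to the other.
w = np.array([0.5, 0.5])
y_before = (w[0] + w[1]) * x
w_moved = w + 0.3 * null_dir
y_after = (w_moved[0] + w_moved[1]) * x
```

Here the null direction is proportional to (1, -1): any change that trades weight between the two synapses is functionally invisible, which is the sense in which a sloppy direction can absorb synaptic change (e.g. during continual learning) without altering the network's responses.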