Linking connectivity and dynamics in low-rank recurrent neural networks
Synaptic connectivity determines the dynamics and computations performed by cortical neural networks. Due to the highly recurrent nature of their circuitry, the relationship between connectivity, dynamics and computations is complex, and understanding it requires theoretical models. Classical models of recurrent networks are based on connectivity that is either fully random or highly structured, e.g. clustered. Experimental measurements, in contrast, show that cortical connectivity lies somewhere between these two extremes. Moreover, a number of functional approaches suggest that a minimal amount of structure in the connectivity is sufficient to implement a large range of computations.
Based on these observations, we developed a theory of recurrent networks with a connectivity consisting of a combination of a random part and a minimal, low-dimensional structure. We showed that in such networks, the dynamics are low-dimensional and can be directly inferred from connectivity using a geometrical approach. We exploited this understanding to determine the minimal connectivity structures required to implement specific computations. We found that the dynamical range and computational capacity of a network quickly increase with the dimensionality of the structure in the connectivity. Our simplified theoretical framework captures and connects a number of outstanding experimental observations, in particular the fact that neural representations are high-dimensional and distributed, while dynamics are low-dimensional, with a dimensionality that increases with task complexity.
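The connectivity described above can be illustrated with a minimal sketch: a rate network whose connectivity matrix is the sum of a random part and a rank-one term. The parameter names (g for the random strength, m and n for the structure vectors) are assumptions chosen for illustration, not taken from the text; the simulation shows how the high-dimensional activity can be summarized by a single scalar, its projection onto the structure vector.

```python
import numpy as np

# Illustrative sketch (assumed parameterization): connectivity is a random
# matrix plus a rank-one structure, J = g*chi + m n^T / N.
rng = np.random.default_rng(0)
N = 500                                        # number of units
g = 0.8                                        # strength of the random part
chi = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random connectivity
m = rng.normal(0.0, 1.0, N)                    # left structure vector
n = rng.normal(0.0, 1.0, N)                    # right structure vector
J = g * chi + np.outer(m, n) / N               # random + low-rank structure

# Euler integration of the rate dynamics dx/dt = -x + J * tanh(x)
dt, steps = 0.1, 2000
x = rng.normal(0.0, 1.0, N)
kappa = []                                     # projection of activity onto m
for _ in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))
    kappa.append(m @ np.tanh(x) / N)

# Although x lives in N dimensions, the trajectory is summarized by the
# scalar kappa: the dynamics are low-dimensional, set by the structure.
```

In this regime the network relaxes to a fixed point, and the scalar projection kappa tracks the entire trajectory; adding further rank-one terms would enlarge the low-dimensional space the dynamics explore, in line with the dimensionality argument above.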