Faugeras

Neural networks do not become asynchronous in the large size limit when
synaptic weights are correlated: there is no propagation of chaos

Abstract:
We have developed a new method for establishing the thermodynamic limit
of a network of fully connected rate neurons with correlated,
Gaussian-distributed synaptic weights and random inputs. The method is
based on the formulation of a large deviation principle (LDP) for the
probability distribution of the neuronal activity of a sequence of
networks of increasing size. The motivation for using random connections
comes from the fact that connections in neural networks are complex,
poorly known, and heterogeneous. The motivation for introducing
correlations is the emphasis in computational neuroscience modelling on
the modularity of neural networks; correlations in the connection
distribution reproduce this modularity, unlike in (MS02; BFT15). The
limiting probability law is Gaussian, and its mean and covariance
functions are computed by a very quickly converging fixed-point
algorithm. Our most striking new result is that, unlike in all previous
work (SCS88; MS02), the network does not become asynchronous in the
thermodynamic limit and there is no propagation of chaos: neurons remain
correlated, and the amount of correlation can be computed precisely from
the correlation between the synaptic weights.
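The abstract does not spell out the fixed-point equations, so the
following is only a rough illustration of the kind of iteration meant by
"fixed-point algorithm": a minimal Picard iteration, in Python, for the
self-consistent mean and variance of the Gaussian field felt by a
typical neuron in a generic rate model. The nonlinearity phi, the
constants j0, j2, sigma_in, and the update map are all illustrative
assumptions, not the equations of the paper.

    import numpy as np

    def mean_field_fixed_point(phi=np.tanh, j0=0.5, j2=1.0, sigma_in=0.5,
                               tol=1e-12, max_iter=10_000):
        """Picard iteration for the self-consistent mean m and variance v
        of the Gaussian field driving a typical neuron (toy model, not
        the paper's equations)."""
        # Gauss-Hermite quadrature for expectations under N(m, v).
        z, w = np.polynomial.hermite_e.hermegauss(61)
        w = w / np.sqrt(2.0 * np.pi)      # weights of a standard normal
        m, v = 0.0, 1.0                   # initial guess
        for _ in range(max_iter):
            x = m + np.sqrt(v) * z        # quadrature nodes for N(m, v)
            e1 = w @ phi(x)               # E[phi(x)]
            e2 = w @ phi(x) ** 2          # E[phi(x)^2]
            m_new = j0 * e1               # assumed mean synaptic drive
            v_new = sigma_in ** 2 + j2 * e2  # input noise + recurrent part
            if abs(m_new - m) + abs(v_new - v) < tol:
                return m_new, v_new       # converged
            m, v = m_new, v_new
        raise RuntimeError("fixed-point iteration did not converge")

    if __name__ == "__main__":
        m, v = mean_field_fixed_point()
        print(f"mean = {m:.6f}, variance = {v:.6f}")

With a contractive update map of this sort, the iteration typically
settles to machine precision in a few dozen steps, which is consistent
with the quick convergence claimed in the abstract.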

[BFT15] M. Bossy, O. Faugeras, and D. Talay. Clarification and
complement to "mean-field description and propagation of chaos in
networks of Hodgkin–Huxley and FitzHugh–Nagumo neurons". The Journal of
Mathematical Neuroscience (JMN), 5(19), September 2015.

[MS02] O. Moynot and M. Samuelides. Large deviations and mean-field
theory for asymmetric random recurrent neural networks. Probability
Theory and Related Fields, 123(1):41–75, 2002.

[SCS88] H. Sompolinsky, A. Crisanti, and H. J. Sommers. Chaos in Random
Neural Networks. Physical Review Letters, 61(3):259–262, 1988.
