Wednesday, December 05, 2007
Finite size neural networks
A paper by Hedi Soula and myself on the dynamics of a finite-size neural network (Soula and Chow, Neural Comp, 19:3262 (2007)) is out this month. You can also download it from my homepage. As you may infer, I've been preoccupied with finite-size effects lately. In this paper we consider a very simple Markov model of neuronal firing: we presume that the number of neurons active in a given time epoch depends only on the number that were active in the previous epoch. This would be valid, for example, for a network with recurrent excitation. Given this model, we can calculate the equilibrium probability density function for a network of size N directly, and from that all statistical quantities of interest. We also show that the model can describe the dynamics of a network of more biophysically plausible neuron models. The nice thing about our simple model is that we can then compare the exact results to mean field theory, which is the classical way of studying the dynamics of large numbers of coupled neurons. We show that the mean activity is generally well described by mean field theory, except near criticality as expected, but the variance is not. We also show that the network activity can possess a very long correlation time even though the firing statistics of a single neuron do not.
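To make the idea concrete, here is a minimal sketch (not the paper's exact construction) of a Markov chain of this type: the state is the number of active neurons n in {0, ..., N}, each neuron fires in the next epoch with a probability set by the current fraction active, and the equilibrium distribution of a finite network is computed directly and compared with the mean field fixed point. The sigmoidal gain function, its parameters, the binomial transition kernel, and the network size below are all assumptions made for illustration.

```python
import numpy as np
from scipy.stats import binom
from scipy.optimize import brentq

# Illustrative assumptions: N neurons, each firing in the next epoch with
# probability f(n/N), where n is the current number of active neurons and
# f is a sigmoidal gain meant to mimic recurrent excitation.
N = 50
gain, threshold, width = 0.9, 0.3, 0.1   # hypothetical gain parameters

def f(x):
    """Firing probability as a function of the fraction of active neurons."""
    return gain / (1.0 + np.exp(-(x - threshold) / width))

# Transition matrix P[n, m] = Prob(m neurons active next epoch | n active now),
# which is Binomial(N, f(n/N)) under the assumptions above.
n_vals = np.arange(N + 1)
P = np.array([binom.pmf(n_vals, N, f(n / N)) for n in n_vals])

# Equilibrium distribution: left eigenvector of P with eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.abs(np.real(evecs[:, np.argmin(np.abs(evals - 1.0))]))
pi /= pi.sum()

# Finite-N mean and variance of the fraction of active neurons.
mean_frac = np.dot(pi, n_vals) / N
var_frac = np.dot(pi, (n_vals / N - mean_frac) ** 2)

# Mean field prediction: fixed point of x = f(x).
x_mf = brentq(lambda x: f(x) - x, 1e-6, 1.0)

print(f"finite-N mean fraction active: {mean_frac:.4f}  (variance {var_frac:.5f})")
print(f"mean field fixed point:        {x_mf:.4f}")
```

With these made-up parameters the finite-N mean sits close to the mean field fixed point, while the variance is an explicitly finite-size quantity that mean field theory alone does not give you, which is the kind of comparison the paper carries out exactly.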