Poster presentation

Parametric estimation of spike train statistics

Bruno Cessac, Thierry Viéville (Thierry.Vieville@sophia.inria.fr)

LJAD, U. of Nice-Sophia, France

NEUROMATHCOMP, INRIA Sophia-Antipolis Méditerranée, France

CORTEX, INRIA-LORIA, France

BMC Neuroscience 2009, 10(Suppl 1):P165
Eighteenth Annual Computational Neuroscience Meeting: CNS*2009, Berlin, Germany, 18–23 July 2009 (http://www.cnsorg.org/2009/)
http://www.biomedcentral.com/1471-2202/10/S1/P165
doi:10.1186/1471-2202-10-S1-P165
Published: 13 July 2009
© 2009 Cessac and Viéville; licensee BioMed Central Ltd.

Introduction

We consider the evolution of a network of neurons, focusing on the asymptotic behavior of the spike dynamics rather than of the membrane potential dynamics. In this context the spike response is not sought as a deterministic response but as a conditional probability: "reading the code" consists of inferring this probability [1]. Since experimentally one only has access to finite-time raster plots, and since the convergence of the empirical statistics to their averages can be quite slow, we use a parametric statistical model based on a thermodynamic formalism. The natural candidate for spike train statistics is a Gibbs measure [2]. Our work generalizes this seminal and profound work of Bialek and collaborators. The model allows us to predict the conditional probability of rank-R Markovian spike patterns and is strongly linked to the thermodynamic formalism [3]. It generalizes most statistical models of spike patterns (e.g. Poisson, correlated Poisson, etc.).
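As a minimal illustration of the parametric form involved (the notation below, including the monomials \(\phi_k\) and parameters \(\lambda_k\), is our own shorthand and not fixed by the abstract), the statistics of spike blocks \(\omega\) of range R can be described by a Gibbs potential that is a linear combination of spike observables:

\[
\psi_\lambda(\omega) \;=\; \sum_{k} \lambda_k \,\phi_k(\omega),
\qquad
\mu_\lambda[\omega] \;\propto\; e^{\psi_\lambda(\omega)},
\]

where each monomial \(\phi_k(\omega)\) is a product of spike events (firing rates, pairwise correlations, delayed synchronies, ...). Bernoulli and correlated-Poisson-like models correspond to particular choices of the \(\phi_k\).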

Methods

A minimal instantiation of the formalism is reviewed, following [3,4], and a general algorithmic estimation method is proposed that minimizes the relative entropy and yields fast-converging implementations. We also make explicit how several spike observables (entropy, rates, synchronizations, correlations) are obtained in closed form from the parametric estimation. This paradigm not only allows us to estimate the spike statistics given a design choice, but also to compare different models, thus answering comparative questions about the neural code such as: are correlations, time synchrony, or a given set of spike patterns significant with respect to rate coding?
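A self-contained toy sketch of this kind of estimation (our own illustration in Python, not the authors' implementation; the memoryless range-1 setting, the choice of observables, and all variable names are assumptions made for illustration): fit the parameters of a Gibbs/maximum-entropy model by gradient descent on the relative entropy between the empirical and model pattern distributions.

# Toy estimation sketch (not the EnaS implementation): fit a parametric Gibbs /
# maximum-entropy model to spike-pattern statistics by minimizing the relative
# entropy between empirical and model distributions. Patterns are single-time
# words over N neurons; observables are firing rates and pairwise synchronies.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N, T = 4, 10000                          # neurons, time bins (toy sizes)
raster = (rng.random((T, N)) < 0.2)      # surrogate raster: independent Bernoulli

pairs = list(itertools.combinations(range(N), 2))
def phi(w):
    """Observable vector phi(w): the N rates w_i and the pairwise products w_i*w_j."""
    w = np.asarray(w, float)
    return np.concatenate([w, [w[i] * w[j] for i, j in pairs]])

patterns = np.array(list(itertools.product([0, 1], repeat=N)), float)
Phi = np.array([phi(w) for w in patterns])            # (2**N, K) observable matrix
emp = np.mean([phi(w) for w in raster], axis=0)       # empirical observable averages

lam = np.zeros(Phi.shape[1])
for _ in range(3000):                                 # gradient descent on the relative entropy
    p = np.exp(Phi @ lam)
    p /= p.sum()                                      # model (Gibbs) distribution
    lam += 0.2 * (emp - Phi.T @ p)                    # gradient: <phi>_emp - <phi>_model
print("fitted parameters:", np.round(lam, 2))

For an independent Bernoulli raster the fitted pairwise parameters should stay close to zero, which is one way of checking whether correlations are significant with respect to rate coding.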

Results

A numerical validation of the method is proposed, analyzing the statistics of small groups of neurons (up to 8/12), whereas the state of the art considers pairs only. The parametric statistical potential of Markov processes up to rank 16/20 is computable, thus considering up to 2^20 states for the process. The method has been carefully calibrated against standard processes such as Bernoulli processes. The implementation relies on several well-established numerical methods, in order to be applicable to a large set of possible data. It is available as an open-source module in the http://enas.gforge.inria.fr middle-ware set. EnaS is a set of classes for simulating and analyzing so-called "event neural assemblies." It is designed mainly as a plug-in for existing simulators (e.g. MVASpike, or other simulators via the NeuralEnsemble meta-simulation platform) or as an add-on for computations with neural unit assemblies on standard platforms. It is usable in C/C++, Java and Python.
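As an illustration of the kind of calibration mentioned above (a toy check of our own, not the EnaS test suite): for an independent Bernoulli raster, the empirical probability of any spike word should match the product of the individual firing probabilities.

# Toy calibration check (our own sketch): compare the empirical probability of a
# spike word in an independent Bernoulli raster with the product-of-rates prediction.
import numpy as np

rng = np.random.default_rng(1)
N, T, p = 3, 200000, 0.3
raster = (rng.random((T, N)) < p)

word = np.array([1, 0, 1])                        # an arbitrary spike word
empirical = np.mean(np.all(raster == word, axis=1))
predicted = np.prod(np.where(word, p, 1 - p))     # independent-neuron prediction
print(f"empirical {empirical:.4f} vs predicted {predicted:.4f}")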

Acknowledgements

Partially supported by the ANR MAPS & the MACCAC ARC projects.

References

1. Rieke F, Warland D, de Ruyter van Steveninck R, Bialek W: Spikes, Exploring the Neural Code. MIT Press; 1996.
2. Schneidman E, Berry MJ, Segev R, Bialek W: Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 2006, 440:1007-1012. doi:10.1038/nature04701
3. Chazottes JR, Floriani E, Lima R: Relative entropy and identification of Gibbs measures in dynamical systems. J Statist Phys 1998, 90:697-725. doi:10.1023/A:1023220802597
4. Cessac B, Rostro-Gonzalez H, Vasquez JC, Viéville T: How Gibbs distribution may naturally arise from synaptic adaptation mechanisms. J Stat Phys.