Dimensionality Reduction on Spatio-Temporal Maximum Entropy Models of Spiking Networks
Abstract
Maximum entropy models (MEM) have been widely used in the last 10 years to characterize the statistics of networks of spiking neurons. A major drawback of this approach is that the number of parameters used in the statistical model increases very quickly with network size, hindering its interpretation and fast computation. Here, we present a novel framework of dimensionality reduction for generalized MEM handling spatio-temporal correlations. This formalism is based on information geometry, where a MEM is a point on a high-dimensional manifold. We exploit the geometrical properties of this manifold to find a projection onto a lower-dimensional space that best captures the high-order statistics. This allows us to define a quantitative criterion that we call the "degree of compressibility" of the neuronal code. A powerful aspect of this method is that it does not require fitting the model: the matrix defining the metric of the manifold is computed directly from the data, without parameter fitting. The method is first validated using synthetic data generated from known statistics. We then analyze a MEM having more parameters than the underlying data statistics and show that our method detects the extra dimensions. We then test the method on experimental retinal data. We record retinal ganglion cell (RGC) spiking activity using multi-electrode arrays (MEA) under different visual stimuli: spontaneous activity, a white noise stimulus, and a natural scene. Using our method, we report a dimensionality reduction of up to 50% for the retinal data. As we show, this is a substantial reduction compared to a randomly generated spike train, suggesting that the neuronal code in these experiments is highly compressible. It also shows that the dimensionality reduction depends on the stimulus statistics, supporting the idea that sensory networks adapt to stimulus statistics by modifying their level of redundancy.
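For readers unfamiliar with the formalism, the following is a minimal sketch in standard maximum-entropy notation; the symbols \(\omega\), \(m_k\), \(h_k\) and \(\chi\) are generic choices used here for illustration and are not taken from the paper itself.

% Generic spatio-temporal maximum entropy (Gibbs) form. Here \omega is a spike
% configuration, m_k(\omega) a spatio-temporal observable (e.g. a product of
% spike events across neurons and time lags), and h_k its conjugate parameter.
\[
  P_h(\omega) = \frac{1}{Z(h)} \exp\!\Big( \sum_{k=1}^{K} h_k\, m_k(\omega) \Big),
  \qquad
  Z(h) = \sum_{\omega} \exp\!\Big( \sum_{k=1}^{K} h_k\, m_k(\omega) \Big).
\]
% For such an exponential family, the Fisher information metric on the manifold
% of models is the covariance (susceptibility) matrix of the observables,
\[
  \chi_{kl} = \frac{\partial^2 \log Z(h)}{\partial h_k\, \partial h_l}
            = \langle m_k m_l \rangle - \langle m_k \rangle \langle m_l \rangle ,
\]
% whose entries can be estimated from empirical averages of the observables,
% consistent with the abstract's statement that the metric is obtained directly
% from the data without fitting the parameters h_k. Keeping only the leading
% eigen-directions of \chi then gives a lower-dimensional projection of the model.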