Sunday July 6, 2025 17:20 - 19:20 CEST
P069 Statistics of spiking neural networks based on counting processes

Anne-Marie Greggs1*, Alexander Dimitrov1

1Department of Mathematics and Statistics, Washington State University, Vancouver, WA

*Email: anne-marie.greggs@wsu.edu
Introduction

Estimating neuronal network activity as point processes is challenging due to the singular nature of spiking events and the high dimensionality of the signals[1]. This project analyzes spiking neural networks (SNNs) using counting-process statistics, the equivalent integral representation of point processes[2]. A small SNN of Leaky Integrate-and-Fire (LIF) neurons is simulated, and spiking events are accumulated into a vector counting process N(t). The Poisson counting process has known dynamic statistics: both the mean and the variance at time t grow linearly in time, equal to r_i*t for each independent source with rate r_i. Standardizing the data removes the mean dynamics and the heteroscedasticity, allowing direct comparison to a baseline Poisson counting process.
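As a concrete illustration of this baseline, the following Python sketch (with an assumed rate, bin size, and number of samples; it is not part of the study's code) simulates a homogeneous Poisson counting process and checks that the empirical mean and variance at time t both approach r*t.

import numpy as np

rng = np.random.default_rng(0)
r, T, dt, n_samples = 20.0, 1.0, 0.001, 1000   # assumed rate (Hz), duration (s), bin (s), samples
t = np.arange(dt, T + dt, dt)

# Bernoulli events in small bins approximate a homogeneous Poisson process
spikes = rng.random((n_samples, t.size)) < r * dt
N = np.cumsum(spikes, axis=1)                  # counting process N(t), one row per sample

print(N.mean(axis=0)[-1], r * T)               # empirical vs. theoretical mean at t = T
print(N.var(axis=0)[-1], r * T)                # empirical vs. theoretical variance at t = T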
Methods
Using Brian2[3], an SNN with LIF neurons and Poisson inputs is simulated. Both independent and correlated Poisson processes are modeled, generating spike trains for analysis. The counting process, a stochastic process giving the number of events that have occurred up to each point in time, is analyzed in vector form across units. The mean and covariance of the spike counts are estimated for both the SNN and the Poisson processes; the counts are then standardized by subtracting the mean and scaling by the standard deviation, which removes their temporal dependence and facilitates comparison of the statistical properties of the two models.
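The abstract does not list the simulation code; the following minimal Brian2 sketch, with assumed parameters and weights, illustrates the kind of setup described: two LIF neurons driven by a pool of independent Poisson inputs, with a spike monitor providing the spike times that are later accumulated into the vector counting process.

from brian2 import (NeuronGroup, PoissonGroup, Synapses, SpikeMonitor,
                    run, ms, mV, Hz)

tau = 10*ms                                   # membrane time constant (assumed)
lif = NeuronGroup(2, 'dv/dt = -v / tau : volt',
                  threshold='v > 10*mV', reset='v = 0*mV', method='exact')

inputs = PoissonGroup(100, rates=50*Hz)       # independent Poisson sources (assumed)
feedforward = Synapses(inputs, lif, on_pre='v += 0.5*mV')
feedforward.connect(p=0.5)                    # shared inputs induce correlations

recurrent = Synapses(lif, lif, on_pre='v += 0.2*mV')
recurrent.connect(condition='i != j')         # weak recurrent coupling (assumed)

spikes = SpikeMonitor(lif)                    # records per-neuron spike times
run(1000*ms)
# spikes.spike_trains() returns the spike times of each neuron, which are
# then binned and accumulated into the vector counting process N(t).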
Results
Fig. 1 shows the simulated spiking dynamics of two neurons over time. The standardized counts show variability consistent with Poisson statistics: the mean counts follow a consistent trend, while the variance reflects the stochastic nature of the neural activity. In the centered plot the standard deviation grows as the square root of the product of rate and time, sqrt(r*t); in the standardized plot it equals 1, providing a comparison template. Standardization starts at 200 milliseconds to avoid the bias introduced by rescaling the initially small counts.
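The centering and standardization underlying Fig. 1 can be sketched as follows (again with an assumed rate and 100 simulated Poisson samples, not the study's data); the 200 ms cutoff avoids dividing by the small values of sqrt(r*t) near t = 0.

import numpy as np

rng = np.random.default_rng(0)
r, T, dt, n_samples = 20.0, 1.0, 0.001, 100    # assumed rate (Hz), duration (s), bin (s), samples
t = np.arange(dt, T + dt, dt)
N = np.cumsum(rng.random((n_samples, t.size)) < r * dt, axis=1)   # counting process N(t)

centered = N - r * t                           # standard deviation grows like sqrt(r*t)
standardized = (N - r * t) / np.sqrt(r * t)    # standard deviation stays near 1

keep = t >= 0.2                                # start at 200 ms, as in the figure
print(standardized[:, keep].std(axis=0).mean())  # close to 1 for Poisson counts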
The covariance matrix quantifies the relationships between neurons at given times and activity levels. Comparing the SNN to the modeled Poisson processes reveals notable differences in covariance structure, with the SNN showing stronger inter-unit correlations.
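The covariance comparison can be illustrated with a toy model (assumed rates and a hypothetical shared-input construction; the actual SNN covariances come from the simulation, not from this sketch): independent Poisson counts yield a near-diagonal standardized covariance matrix, while a shared input produces off-diagonal structure of the kind observed in the SNN.

import numpy as np

rng = np.random.default_rng(1)
r, t0, n_samples = 20.0, 1.0, 1000             # assumed rate (Hz), observation time (s), samples

# Independent Poisson counts for two units at time t0
N_indep = rng.poisson(r * t0, size=(n_samples, 2))

# Correlated counts: a common Poisson source contributes to both units (hypothetical model)
shared = rng.poisson(0.3 * r * t0, size=(n_samples, 1))
N_corr = rng.poisson(0.7 * r * t0, size=(n_samples, 2)) + shared

def standardize(N):
    return (N - r * t0) / np.sqrt(r * t0)      # standardize against the Poisson baseline

print(np.cov(standardize(N_indep), rowvar=False))  # close to the identity matrix
print(np.cov(standardize(N_corr), rowvar=False))   # off-diagonal terms reflect the shared input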

Discussion
This study establishes a framework for analyzing the statistical properties of neural network activity, enabling researchers to gain insights into the dynamics of spiking networks. Understanding these aspects is crucial for examining how neural networks respond to stimuli and adapt to changing environments.
The findings highlight the importance of inter-unit dependencies in neural data, and the proposed estimators capture these dynamics effectively. Future work should broaden the parameter exploration and apply the estimators to more complex models and real-world data, including comparisons with inhomogeneous Poisson processes with time-varying rates, processes with temporal dependencies, and the non-Poisson processes generated by SNNs.




Figure 1. Two 3D plots compare the activity of Neuron 1 and Neuron 2 over time. The left plot shows counts centered by the theoretical expectation, while the right plot shows counts standardized by both the theoretical expectation and the theoretical standard deviation, with one line per each of the 100 samples.
Acknowledgements
N/A
References
1. Brown EN, Kass RE, Mitra PP (2004). Multiple neural spike train data analysis: state-of-the-art and future challenges. Nature Neuroscience 7(5):456-461. doi: 10.1038/nn1254
2. Cox DR, Isham V (1980). Point Processes. Chapman and Hall.
3. Stimberg M, Brette R, Goodman DFM (2019). Brian 2, an intuitive and efficient neural simulator. eLife 8:e47314. doi: 10.7554/eLife.47314


