P142 Finite-sampling bias correction for discrete Partial Information Decomposition
Loren Koçillari*1,2, Gabriel M. Lorenz1,4, Nicola M. Engel1, Marco Celotto1,5, Sebastiano Curreli3, Simone B. Malerba1, Andreas K. Engel2, Tommaso Fellin3, and Stefano Panzeri1
1Institute for Neural Information Processing, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf (UKE), Hamburg, Germany
2Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf (UKE), Hamburg, Germany
3Istituto Italiano di Tecnologia, Genova, Italy
4Department of Pharmacy and Biotechnology, University of Bologna, Bologna, Italy
5Department of Brain and Cognitive Sciences, Picower Institute for Learning and Memory, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA
*Email: l.kocillari@uke.de
Introduction
A major question in neuroscience is how groups of neurons interact to generate behavior. Shannon information theory has been widely used to quantify dependencies among neural units and cognitive variables [1]. Partial Information Decomposition (PID) [2,3] extends Shannon theory by decomposing neural information into synergy, redundancy, and unique information, and discrete versions of PID are well suited to spike train analysis. However, estimating information measures from real data suffers a systematic upward bias due to limited sampling [4], an issue that has been largely overlooked in PID analyses of neural data.
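The limited-sampling bias [4] can be seen directly with a plug-in (maximum-likelihood) mutual-information estimator: when stimulus and response are independent the true information is exactly zero, yet the finite-sample estimate is positive on average. The sketch below is illustrative (the estimator, alphabet sizes, and trial count are our assumptions, not the authors' code):

```python
import numpy as np

def plugin_mi(x, y):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples,
    obtained by inserting the empirical joint distribution into the
    Shannon formula."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    p = np.zeros((xv.size, yv.size))
    np.add.at(p, (xi, yi), 1)       # empirical joint counts
    p /= p.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
# Stimulus and response drawn independently: the true MI is 0, yet the
# average plug-in estimate at 100 trials is strictly positive (the bias).
est = np.mean([plugin_mi(rng.integers(0, 4, 100), rng.integers(0, 8, 100))
               for _ in range(200)])
```

Because PID terms are built from such information estimates, this upward bias propagates into the decomposition itself.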
Methods
Here, we first studied the bias of discrete PID through simulations of neuron pairs with varying degrees of synergy and redundancy, modeling each neuron as a sum of Poisson processes with individual and shared terms modulated by the stimulus. We assumed that the bias of union information (the sum of unique information and redundancy) equals the bias of the information obtained from stimulus-uncorrelated neurons. This assumption accurately matched simulated data, allowing us to derive analytical approximations of PID biases in the large-sample regime. We used this knowledge to develop efficient bias-correction methods, validating them on empirical recordings from 53,113 neuron pairs in the auditory cortex, posterior parietal cortex, and hippocampus of mice.
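The generative model and the bias assumption can be sketched as follows. The Poisson rates, response clipping, and the shuffle-based surrogate are illustrative assumptions, not the authors' exact procedure; the point is that shuffling stimulus labels produces stimulus-uncorrelated neurons with unchanged response statistics, which operationalizes the stated bias assumption:

```python
import numpy as np

def plugin_mi(x, y):
    # Plug-in mutual information in bits from paired discrete samples.
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    p = np.zeros((xv.size, yv.size))
    np.add.at(p, (xi, yi), 1)
    p /= p.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
n_trials, n_stim = 200, 2

# Neuron pair as sums of Poisson terms: a stimulus-modulated shared
# component plus independent individual components (illustrative rates).
stim = rng.integers(0, n_stim, size=n_trials)
shared = rng.poisson(1.0 * (1 + stim))
r1 = np.minimum(shared + rng.poisson(1.0, n_trials), 5)  # few discrete levels
r2 = np.minimum(shared + rng.poisson(1.0, n_trials), 5)
joint_resp = r1 * 6 + r2   # encode the response pair as one discrete variable

# Surrogate bias estimate: information surviving after shuffling stimulus
# labels, i.e. from stimulus-uncorrelated neurons with the same response
# statistics. Subtracting it gives a shuffle-corrected estimate.
raw = plugin_mi(stim, joint_resp)
shuff = np.mean([plugin_mi(rng.permutation(stim), joint_resp)
                 for _ in range(100)])
corrected = raw - shuff
```

The shuffled estimate is positive despite carrying no stimulus information, which is exactly the finite-sampling bias the correction targets.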
Results
Our results show that limited-sampling bias affects all terms of discrete PIDs, with synergy exhibiting the largest upward bias. The bias of synergy grows quadratically with the number of possible discrete responses of individual neurons, the bias of unique information scales linearly and takes intermediate values, and redundancy remains almost unbiased. Neglecting or failing to correct for this bias therefore leads to substantially inflated synergy estimates. Simulations and real-data analyses showed that our bias-correction procedures mitigate this problem, yielding much more precise estimates of all PID components.
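The growth of the bias with response alphabet size can be checked numerically. For an independent stimulus and response the plug-in estimate is pure bias, and to first order [4] it grows with the number of response bins; since a neuron pair's joint response alphabet is the product of the individual ones, this is consistent with the roughly quadratic growth of the synergy bias in the per-neuron bin count. A toy check (alphabet sizes, trial counts, and repetitions are illustrative assumptions):

```python
import numpy as np

def plugin_mi(x, y):
    # Plug-in mutual information in bits from paired discrete samples.
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    p = np.zeros((xv.size, yv.size))
    np.add.at(p, (xi, yi), 1)
    p /= p.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

rng = np.random.default_rng(2)

def avg_bias(n_resp, n_stim=4, n_trials=150, reps=300):
    """Average plug-in MI between an independent stimulus and response;
    the true MI is 0, so the average is pure limited-sampling bias."""
    return np.mean([plugin_mi(rng.integers(0, n_stim, n_trials),
                              rng.integers(0, n_resp, n_trials))
                    for _ in range(reps)])

# Bias increases steadily as the response alphabet doubles.
biases = [avg_bias(r) for r in (2, 4, 8, 16)]
```

At fixed trial count, doubling the number of response bins visibly increases the estimated bias, illustrating why coarse response discretization or larger samples are needed for reliable PID estimates.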
Discussion
Our study highlights the systematic overestimation of synergy in both simulated and empirical datasets, underscoring the need for bias-correction methods, and offers empirically validated ways to correct for this problem. These findings provide a computational and theoretical basis for enhancing the reliability of PID analyses in neuroscience and related fields. Our work informs experimental design by providing guidelines on the sample sizes required for unbiased PID estimates and supports computational neuroscientists in selecting efficient PID bias-correction methods.
Acknowledgements
This work was supported by the NIH Brain Initiative grant U19 NS107464 (to SP and TF), the NIH Brain Initiative grants R01 NS109961 and R01 NS108410, the Simons Foundation Autism Research Initiative (SFARI) grant 982347 (to SP), and the European Union's European Research Council grants NEUROPATTERNS 647725 (to TF) and cICMs ERC-2022-AdG-101097402 (to AKE).
References
1. Quian Quiroga, R, Panzeri, S (2009). Extracting information from neuronal populations: information theory and decoding approaches. Nature Reviews Neuroscience, 10, 173-185.
2. Williams, PL, Beer, RD (2010). Nonnegative decomposition of multivariate information. arXiv preprint arXiv:1004.2515.
3. Bertschinger, N, Rauh, J, Olbrich, E, Jost, J, Ay, N (2014). Quantifying unique information. Entropy, 16, 2161-2183.
4. Panzeri, S, Treves, A (1996). Analytical estimates of limited sampling biases in different information measures. Network: Computation in Neural Systems, 7, 87-107.