Spatiotemporal integration of motion signals in MT neurons shapes perceptual decisions
Lucia Arancibia*1, Jacob L. Yates2, Alexander C. Huk3, 4, Klaus Wimmer1, Alexandre Hyafil1
1 Computational Neuroscience Group, Centre de Recerca Matemàtica, Barcelona, Spain
2 Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
3 Fuster Laboratory, Departments of Psychiatry & Biobehavioral Sciences and Ophthalmology, UCLA, Los Angeles, CA, USA
4 Center for Perceptual Systems, Institute for Neuroscience, Department of Neuroscience, Department of Psychology, The University of Texas at Austin, Austin, TX, USA
*Email: larancibia@crm.cat
Introduction
Perception requires integrating noisy, dynamic visual information across the visual field to identify relevant stimuli and guide decisions. While temporal integration of sensory evidence has been studied extensively in experiments with highly controlled visual stimuli and reverse-correlation techniques [1], these studies often neglect the nonlinear mechanisms underlying spatial integration. Neurons in the middle temporal area (MT) show antagonistic motion-direction selectivity within their receptive fields (RFs) [2], which could mediate the spatial suppression effects observed in motion discrimination [3]. Here we leverage jointly recorded behavioral and neural responses to complex, spatially and temporally varying stimuli to characterize spatial effects in MT neurons and link them to behavior.

Methods
We analyzed a previously published dataset [4,5] of neural and behavioral responses from monkeys performing a motion discrimination task (Fig. 1a). Monkeys viewed fields of Gabor patches whose motion varied across space and time. We used logistic regression to model behavioral responses as a function of the spatiotemporal stimulus characteristics. In addition, we fit Poisson regression models to measure the impact of the spatiotemporal components of the motion field on the instantaneous firing rates of single MT units (Fig. 1d). Finally, a hierarchical model with an MT-like input layer and a decision layer was used to simulate choices and compare them to the monkeys'.

Results
Monkeys integrate spatial evidence sublinearly because of (i) the weaker impact of motion farther from the fovea, and (ii) surround suppression effects that attenuate responses to motion in the center of the stimulus (Fig. 1b-c). To investigate the neural basis of these effects, we used nonlinear regression and show that the instantaneous firing rates of MT neurons can be predicted from both excitatory and suppressive contributions of motion within the neurons' RFs (Fig. 1d-e).
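The two regression analyses described in Methods can be sketched on synthetic data. This is a minimal numpy-only illustration, not the fitted models from the study: the dimensions, true weights, and learning rates are all illustrative assumptions, and eccentricity-dependent weights and a center-excitatory/surround-suppressive kernel are simply planted so the fits have a known target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each trial delivers net motion evidence at n_loc
# spatial locations (ordered fovea -> periphery); values are illustrative.
n_trials, n_loc = 5000, 4
X = rng.standard_normal((n_trials, n_loc))

# --- Behavior: logistic regression of choice on the spatial motion field ---
w_true = np.array([1.0, 0.7, 0.4, 0.2])          # weaker impact with eccentricity
choice = rng.random(n_trials) < 1 / (1 + np.exp(-(X @ w_true)))

w = np.zeros(n_loc)
for _ in range(3000):                             # gradient ascent on the log-likelihood
    p = 1 / (1 + np.exp(-(X @ w)))
    w += 0.5 * X.T @ (choice - p) / n_trials

# --- Neural: Poisson regression of spike counts on the same motion field ---
k_true = np.array([0.6, 0.3, -0.2, -0.1])        # excitatory center, suppressive surround
spikes = rng.poisson(np.exp(X @ k_true))          # counts with exponential link

k = np.zeros(n_loc)
for _ in range(3000):                             # gradient ascent on the Poisson log-likelihood
    mu = np.exp(X @ k)
    k += 0.1 * X.T @ (spikes - mu) / n_trials

print("behavioral kernel:", np.round(w, 2))       # decays with eccentricity
print("neural kernel:    ", np.round(k, 2))       # positive center, negative surround
```

With enough trials, both fits recover the planted spatial kernels: the behavioral weights fall off with eccentricity, and the neural kernel separates excitatory and suppressive contributions, mirroring the structure of the analyses in the abstract.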
We link the neural and behavioral findings in a hierarchical model of spatiotemporal decision making in which spatial context modulates stimulus integration in sensory neurons, and a decision area supports temporal integration to generate the percept [6].

Discussion
We found that the spatial configuration of motion stimuli significantly shapes both sensory representations and perceptual decisions. Further, a decision model incorporating empirically derived MT tuning properties reproduced the observed behavioral spatial integration patterns, suggesting that spatial suppression between local moving elements at the sensory level contributes directly to global motion perception. Taken together, our results provide a deeper understanding of how the brain processes dynamic visual information, and how specific nonlinear properties of sensory neurons shape perceptual choices.

Acknowledgments
This work is funded by the Spanish State Research Agency (PID2020-112838RB-100) and supported through the Severo Ochoa and María de Maeztu Program for Centers and Units of Excellence in R&D (CEX2020-001084-M).

References
[1] Gold & Shadlen (2007). Annu Rev Neurosci. https://doi.org/10.1146/annurev.neuro.29.051605.113038
[2] Allman et al. (1985). Perception. https://doi.org/10.1068/p140105
[3] Tadin et al. (2003). Nature. https://doi.org/10.1038/nature01800
[4] Yates et al. (2017). Nat Neurosci. https://doi.org/10.1038/nn.4611
[5] Levi et al. (2023). J Neurosci. https://doi.org/10.1523/JNEUROSCI.0267-22.2023
[6] Wimmer et al. (2015). Nat Commun. https://doi.org/10.1038/ncomms7177
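As an illustration of the hierarchical model structure described above (an MT-like sensory layer feeding a temporally integrating decision stage), here is a toy sketch. Everything in it is an assumption for demonstration: the weights, the divisive form of surround suppression, and the noise levels are not the fitted model parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-stage model: an MT-like layer applies center-excitatory /
# surround-suppressive spatial weighting to the motion field, and a
# decision stage integrates the net sensory signal over time.
n_trials, n_t, n_loc = 2000, 10, 4
stim = 0.3 + 0.5 * rng.standard_normal((n_trials, n_t, n_loc))   # signed motion per frame

w_center = np.array([1.0, 0.6, 0.3, 0.1])   # excitatory drive, fovea -> periphery
w_surround = 0.3                             # strength of divisive surround suppression

drive = stim @ w_center                                  # (n_trials, n_t) feedforward drive
suppression = 1.0 + w_surround * np.abs(stim).sum(axis=2)
mt_out = drive / suppression                             # sublinear spatial integration

dv = mt_out.sum(axis=1) + 0.5 * rng.standard_normal(n_trials)    # temporal integration + noise
choice = dv > 0
print(f"proportion of choices matching the motion bias: {choice.mean():.2f}")
```

Because the divisive term grows with total local motion energy, doubling the number of stimulated locations less than doubles the decision signal, which is the qualitative signature of the sublinear spatial integration reported in Results.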