Stochastic thermodynamics as a framework for modeling synaptic plasticity
Jan Karbowski*
Institute of Applied Mathematics and Mechanics, University of Warsaw, Warsaw, Poland
*Email: j.karbowski@uw.edu.pl
Introduction Synaptic plasticity is fundamental for learning and memory in the brain. Synapses and their molecular components are submicroscopic physical elements, which are prone to stochastic fluctuations in size and activity. How can such stochastic systems acquire and maintain information, i.e., how can animals retain memories for a long time?
Methods It is argued that the best framework for describing synaptic plasticity is stochastic thermodynamics, a physical theory of information processing and the associated energy costs at micro- and nanoscales [1,2].
Results Stochastic and information thermodynamics are applied to neural inference of external signals, and to learning and memory in synaptic systems [2,3,4]. Information gains per unit energy are determined during the LTP phase.
Discussion The results show that acquiring 1 bit of information in a synapse is quite expensive: it costs about 10^5 kT, where k is the Boltzmann constant and T = 300 K. Storing information is not that expensive.
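As a rough order-of-magnitude illustration (not part of the abstract itself), the reported cost of ~10^5 kT per acquired bit can be compared with the Landauer bound kT ln 2, the minimal thermodynamic cost of processing one bit:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K (CODATA value)
T = 300.0          # temperature in kelvin, as in the abstract

# Reported cost of acquiring 1 bit in a synapse: about 10^5 kT
cost_per_bit = 1e5 * k * T          # in joules

# Landauer bound: minimal thermodynamic cost per bit, kT ln 2
landauer = k * T * math.log(2)      # in joules

print(f"synaptic cost per bit ~ {cost_per_bit:.2e} J")
print(f"Landauer bound        ~ {landauer:.2e} J")
print(f"ratio: ~{cost_per_bit / landauer:.0f} times the Landauer limit")
```

This puts the synaptic acquisition cost roughly five orders of magnitude above the fundamental thermodynamic minimum.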
The work was supported by NCN of Poland, grant 2021/41/B/ST3/04300.
[1] Peliti L, Pigolotti S (2021) Stochastic Thermodynamics: An Introduction. Princeton Univ. Press.
[2] Karbowski J (2024) Information thermodynamics: from physics to neuroscience. Entropy 26, 779. https://doi.org/10.3390/e26090779
[3] Karbowski J, Urban P (2024) Neural Comput. 36, 271-311. https://doi.org/10.1162/neco_a_01632
[4] Karbowski J (2021) J. Comput. Neurosci. 49, 71-106. https://doi.org/10.1007/s10827-020-00775-0