Sunday July 6, 2025 17:20 - 19:20 CEST
P098 DelGrad: Exact event-based gradients in spiking networks for training delays and weights

Julian Göltz*+1,2, Jimmy Weber*3, Laura Kriener*3,2,
Sebastian Billaudelle3,1, Peter Lake1, Johannes Schemmel1,
Melika Payvand$3, Mihai A. Petrovici$2

1Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
2Department of Physiology, University of Bern, Bern, Switzerland
3Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland

*Shared first authorship, $Shared senior authorship
+Email: julian.goeltz@kip.uni-heidelberg.de
Introduction
Spiking neural networks (SNNs) inherently rely on the timing of signals for representing and processing information. Incorporating trainable transmission delays, alongside synaptic weights, is crucial for shaping these temporal dynamics. While recent methods have shown the benefits of training delays and weights in terms of accuracy and memory efficiency, they rely on discrete time, approximate gradients, and full access to internal variables like membrane potentials [1]. This limits their precision, efficiency, and suitability for neuromorphic hardware due to increased memory requirements and I/O bandwidth demands.



Methods
To alleviate these issues, and building on prior work on exact gradients in SNNs [2], we propose an analytical approach for calculating exact gradients of the loss with respect to both synaptic weights and delays in an event-based fashion. The inclusion of delays emerges naturally within our proposed formalism, enriching the model’s parameter search space with a temporal dimension (Fig. 1a). Our algorithm is based purely on the timing of individual spikes and does not require access to other variables such as membrane potentials.
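The core idea can be illustrated with a deliberately simplified toy model (not the LIF dynamics used in DelGrad itself): a non-leaky integrate-and-fire neuron where each input j injects a constant current w[j] from its delayed arrival time t[j] + d[j] onward. The first spike time then has a closed form, and implicit differentiation of the threshold condition yields exact gradients with respect to both weights and delays, using only spike times. All names and the neuron model below are illustrative assumptions for this sketch:

```python
import numpy as np

def spike_time(w, t, d, theta=1.0):
    """First spike time T of a non-leaky IF neuron driven by step currents.

    Input j contributes a constant current w[j] starting at its delayed
    arrival time a[j] = t[j] + d[j]. Toy assumption: all inputs arrive
    before T and the summed current is positive, so the threshold
    condition reads  T * sum(w) - sum(w * a) = theta.
    """
    a = t + d
    return (theta + np.sum(w * a)) / np.sum(w)

def exact_grads(w, t, d, theta=1.0):
    """Exact spike-time gradients w.r.t. weights and delays.

    Implicitly differentiating  T * sum(w) - sum(w * a) = theta  gives
    dT/dw[k] = (a[k] - T) / sum(w)  and  dT/dd[k] = w[k] / sum(w);
    both depend only on spike/arrival times, not on membrane traces.
    """
    a = t + d
    T = spike_time(w, t, d, theta)
    return T, (a - T) / np.sum(w), w / np.sum(w)

w = np.array([0.8, 0.5, 0.3])   # synaptic weights
t = np.array([0.0, 0.1, 0.2])   # presynaptic spike times
d = np.array([0.05, 0.0, 0.1])  # transmission delays
T, dT_dw, dT_dd = exact_grads(w, t, d)

# sanity check: analytic delay gradients match finite differences
eps = 1e-6
for j in range(3):
    dp = d.copy()
    dp[j] += eps
    num = (spike_time(w, t, dp) - T) / eps
    assert abs(num - dT_dd[j]) < 1e-4
```

In this toy setting the delay gradient is simply the input's share of the total drive, which mirrors how, in the event-based view, a delay shifts an arrival time and its gradient follows from the same chain rule as the presynaptic spike time.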


Results
We investigate the impact of delays on accuracy and parameter efficiency in both ideal and hardware-aware simulations on the Yin-Yang classification task [3]. Furthermore, while previous work on learnable delays in SNNs has been mostly confined to software simulations, we demonstrate the functionality and benefits of our approach on the BrainScaleS-2 neuromorphic platform [4] (Fig. 1b), successfully training on-chip delays and showing good correspondence with our hardware-aware simulations (Fig. 1c,d).




Discussion
DelGrad presents an event-based framework for gradient-based co-training of weight and delay parameters, without any approximations. For the first time, we experimentally demonstrate the memory-efficiency and accuracy benefits of adding delays to SNNs on noisy mixed-signal hardware. These experiments also reveal the potential of delays for stabilizing networks against noise. DelGrad opens a new way of training SNNs with delays on neuromorphic hardware, resulting in fewer required parameters, higher accuracy, and easier hardware training.




Figure 1. a Information flow in an SNN, effect of weights w and delays d on the membrane potential of a neuron, and raster plot of the activity. b Photo of the neuromorphic chip BrainScaleS-2. c Comparison of networks without (blue) and with (orange) delays, showing the benefit of delays. d Our hardware-aware simulation can be used effectively as a proxy for hardware emulation, and confirms these benefits.
Acknowledgements
This work was funded by the Manfred Stärk Foundation, the EC Horizon 2020 Framework Programme under grant agreement 945539 (HBP) and Horizon Europe grant agreement 101147319 (EBRAINS 2.0), the DFG under Germany’s Excellence Strategy EXC 2181/1-390900948 (STRUCTURES Excellence Cluster), SNSF Starting Grant Project UNITE (TMSGI2-211461), and the VolkswagenStiftung under grant number 9C840.
References
[1] I. Hammouamri, et al. doi: 10.48550/arXiv.2306.17670.
[2] J. Göltz, et al. doi: 10.1038/s42256-021-00388-x.
[3] L. Kriener, et al. doi: 10.1145/3517343.3517380.
[4] C. Pehle, et al. doi: 10.3389/fnins.2022.795876.


Passi Perduti