Training Ultra-Low-Latency Spiking Neural Networks from Scratch

DOI:
10.60864/e9yx-zr05
Citation Author(s):
Gourav Datta, Zeyu Liu, Peter Beerel
Submitted by:
Peter Beerel
Last updated:
6 June 2024 - 10:21am
Document Type:
Presentation Slides
Document Year:
2024
Presenters:
Peter A. Beerel
Paper Code:
SS-L17

Spiking neural networks (SNNs) have emerged as an attractive spatio-temporal computing paradigm for a wide range of low-power vision tasks. However, state-of-the-art (SOTA) SNN models either require multiple time steps, which hinders their deployment in real-time use cases, or significantly increase training complexity. To mitigate this concern, we present a framework for training SNNs from scratch with ultra-low (down to 1) time steps that leverages the Hoyer regularizer. We calculate the threshold for each BANN (binary activation neural network) layer as the Hoyer extremum of a clipped version of its activation map; the clipping value is learned during training using gradient descent with our Hoyer regularizer. We evaluate the efficacy of our training framework on large-scale vision tasks, including traditional and event-based image recognition and object detection. Our experiments demonstrate up to a 34× increase in compute efficiency with a marginal accuracy/mAP drop compared to non-spiking networks. Finally, we implement our framework in the Lava-DL library, thereby enabling the deployment of our SNN models on the Loihi neuromorphic chip.
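To make the threshold computation concrete, the sketch below illustrates one plausible reading of the abstract: the Hoyer extremum of a tensor z taken as ‖z‖₂²/‖z‖₁, applied to activations clipped at a learnable value, with the Hoyer regularizer (the squared L1/L2 norm ratio) available as a sparsity penalty. This is a minimal NumPy illustration under assumed definitions, not the authors' implementation; the function names and the exact forms of `hoyer_extremum` and `hoyer_regularizer` are assumptions.

```python
import numpy as np

def hoyer_extremum(z):
    # Assumed form of the Hoyer extremum: ||z||_2^2 / ||z||_1.
    z = np.asarray(z, dtype=float)
    l1 = np.abs(z).sum()
    return float((z ** 2).sum() / l1) if l1 > 0 else 0.0

def hoyer_regularizer(z):
    # Hoyer sparsity measure: squared ratio of L1 to L2 norm.
    # Added to the training loss to encourage sparse (spike-friendly) activations.
    z = np.asarray(z, dtype=float)
    l2_sq = (z ** 2).sum()
    return float(np.abs(z).sum() ** 2 / l2_sq) if l2_sq > 0 else 0.0

def spike_layer(act, clip_value):
    """Clip activations to [0, clip_value], set the firing threshold to the
    Hoyer extremum of the clipped map, and emit binary spikes (illustrative)."""
    clipped = np.clip(np.asarray(act, dtype=float), 0.0, clip_value)
    threshold = hoyer_extremum(clipped)
    spikes = (clipped >= threshold).astype(float)
    return spikes, threshold
```

For example, with activations `[0.2, 1.5, 3.0, -0.1]` and a clip value of 2.0, the clipped map is `[0.2, 1.5, 2.0, 0.0]`, giving a threshold of 6.29/3.7 ≈ 1.7, so only the largest activation fires. In the actual training framework the clip value is a learnable parameter optimized by gradient descent alongside the network weights.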
