Document type :
Preprint or working paper
Title :
Progressive Layer-based Compression for Convolutional Spiking Neural Network
Author(s) :
Elbez, Hammouda [Auteur]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Fatahi, Mazdak [Auteur]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
English keyword(s) :
Spiking Neural Network
Neuromorphic Computing
Compression
STDP
SpiNNaker
Surrogate Gradient
HAL domain(s) :
Computer Science [cs]/Artificial Intelligence [cs.AI]
Computer Science [cs]/Machine Learning [cs.LG]
English abstract : [en]
Spiking neural networks (SNNs) have attracted interest in recent years due to their low energy consumption and the growing power demands of real-life machine learning applications. Running these bio-inspired networks on neuromorphic hardware for extra-low energy consumption is another exciting aspect of this technology. Furthermore, many works discuss the improvement of SNNs in terms of performance and hardware implementation. This paper presents a progressive layer-based compression approach applied to convolutional spiking neural networks trained either with Spike Time Dependent Plasticity (STDP) or Surrogate Gradient (SG). Moreover, we study the effect of this approach when used with SpiNNaker. This approach, inspired by neuroplasticity, produces highly compressed networks (up to 90% compression rate per layer) while preserving most of the network performance, as shown by experimental results on the MNIST, FMNIST, Caltech face/motorbike, and CIFAR-10 datasets.
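The abstract does not spell out how the per-layer compression is scheduled. As a rough illustration only (not the authors' method), a "progressive layer-based" scheme could be sketched as magnitude-based pruning applied layer by layer, with the per-layer compression rate ramped up over several steps; the function names and the schedule below are hypothetical.

```python
import numpy as np

def prune_layer(weights: np.ndarray, compression_rate: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights of one layer.

    compression_rate: fraction of weights to remove (e.g. 0.9 for 90%).
    """
    flat = np.abs(weights).ravel()
    k = int(flat.size * compression_rate)  # number of weights to drop
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def progressive_compress(layers, target_rate: float, steps: int = 5):
    """Raise the per-layer compression rate gradually toward target_rate.

    Each step prunes every layer at an increased rate; in a real pipeline
    the network would be evaluated (or fine-tuned) between steps to check
    that accuracy is preserved.
    """
    for step in range(1, steps + 1):
        rate = target_rate * step / steps
        layers = [prune_layer(w, rate) for w in layers]
    return layers
```

Already-pruned weights stay at zero on later steps (zero magnitude is always below the threshold), so the schedule is monotone: sparsity only grows until the target rate is reached.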
Language :
English
Collections :
Source :
Files :
- Progressive%20Layer-based%20Compression%20for%20Convolutional%20Spiking%20Neural%20Network.pdf (open access)
- frontiers_SupplementaryMaterial.pdf (open access)