Progressive Layer-based Compression for Convolutional Spiking Neural Network
Document type:
Preprint or working paper
Title:
Progressive Layer-based Compression for Convolutional Spiking Neural Network
Author(s):
Elbez, Hammouda [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Fatahi, Mazdak [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Keywords (English):
Spiking Neural Network
Neuromorphic Computing
Compression
STDP
SpiNNaker
Surrogate Gradient
HAL discipline(s):
Computer Science [cs]/Artificial Intelligence [cs.AI]
Computer Science [cs]/Machine Learning [cs.LG]
Abstract (English):
Spiking neural networks (SNNs) have attracted interest in recent years due to their low energy consumption and the increasing computational demands of real-life machine learning applications. Running these bio-inspired networks on neuromorphic hardware for extremely low energy consumption is another exciting aspect of this technology. Furthermore, many works discuss improving SNNs in terms of performance and hardware implementation. This paper presents a progressive layer-based compression approach applied to convolutional spiking neural networks trained either with Spike-Timing-Dependent Plasticity (STDP) or Surrogate Gradient (SG). Moreover, we study the effect of this approach when used with SpiNNaker. This approach, inspired by neuroplasticity, produces highly compressed networks (up to 90% compression rate per layer) while preserving most of the network performance, as shown by experimental results on the MNIST, FMNIST, Caltech face/motorbike, and CIFAR-10 datasets.
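The abstract describes compressing each layer progressively until a target rate (up to 90% per layer) is reached. As a rough illustration of what such per-layer compression can look like, the sketch below prunes the smallest-magnitude weights of each layer in small increments. This is an assumption-laden toy, not the paper's method: the record does not specify the actual pruning criterion, schedule, or how spiking dynamics are handled, and the function and parameter names here are hypothetical.

```python
import numpy as np

def progressive_layer_compression(weights, target_rate=0.9, step=0.1):
    """Progressively zero out the smallest-magnitude weights of each layer,
    a little more on each pass, until a per-layer compression (sparsity)
    rate is reached. Illustrative sketch only."""
    compressed = []
    for w in weights:
        w = w.astype(float).copy()
        rate = 0.0
        while rate < target_rate:
            rate = min(rate + step, target_rate)  # compress a bit more each pass
            k = int(rate * w.size)                # number of weights pruned so far
            if k == 0:
                continue
            threshold = np.sort(np.abs(w), axis=None)[k - 1]
            w[np.abs(w) <= threshold] = 0.0       # drop the weakest connections
        compressed.append(w)
    return compressed

# Toy demo on two random "layers" (stand-ins for convolutional kernels).
rng = np.random.default_rng(0)
layers = [rng.normal(size=(8, 8)), rng.normal(size=(16, 4))]
for w in progressive_layer_compression(layers, target_rate=0.9):
    sparsity = 1.0 - np.count_nonzero(w) / w.size
    print(f"per-layer compression rate: {sparsity:.2f}")
```

Pruning in steps rather than all at once mirrors the "progressive" aspect named in the title; in a real SNN pipeline one would typically interleave each step with retraining or a performance check.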
Language:
English
Collections:
Source:
Files (open access):
- Progressive Layer-based Compression for Convolutional Spiking Neural Network.pdf
- frontiers_SupplementaryMaterial.pdf