An algorithmic framework for the optimization of deep neural networks architectures and hyperparameters
Document type:
Research report
Title:
An algorithmic framework for the optimization of deep neural networks architectures and hyperparameters
Author(s):
Keisler, Julie [Author]
Université de Lille
EDF Labs
Talbi, El-Ghazali [Author]
Optimisation de grande taille et calcul large échelle [BONUS]
Université de Lille
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Inria Lille - Nord Europe
Claudel, Sandra [Author]
EDF Labs
Cabriel, Gilles [Author]
Université de Lille
EDF Labs
Publisher:
arXiv
Institution:
Université de Lille
Publication date:
2023
Keyword(s) in English:
Neural and Evolutionary Computing (cs.NE)
Artificial Intelligence (cs.AI)
Machine Learning (cs.LG)
FOS: Computer and information sciences
HAL discipline(s):
Computer Science [cs]
Abstract (in English):
In this paper, we propose an algorithmic framework to automatically generate efficient deep neural networks and optimize their associated hyperparameters. The framework is based on evolving directed acyclic graphs (DAGs), defining a more flexible search space than the existing ones in the literature. It allows mixtures of classical operations (convolutions, recurrences, dense layers) as well as more recent operations such as self-attention. Based on this search space, we propose neighbourhood and evolution search operators to optimize both the architecture and the hyperparameters of our networks. These search operators can be used with any metaheuristic capable of handling mixed search spaces. We tested our algorithmic framework with an evolutionary algorithm on a time series prediction benchmark. The results demonstrate that our framework was able to find models outperforming the established baseline on numerous datasets.
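To make the approach described in the abstract more concrete, the sketch below shows one possible way to encode a candidate network as a directed acyclic graph of typed operations and to apply a simple neighbourhood (mutation) operator over the mixed architecture/hyperparameter space. This is only an illustrative Python sketch under assumed names (CandidateDAG, random_dag, mutate, the OPERATIONS table); it is not the authors' implementation and it omits model training and fitness evaluation entirely.

# Minimal, self-contained sketch of a DAG-encoded architecture and a simple
# neighbourhood operator. All names here are hypothetical illustrations; they
# do not come from the paper's actual code.
import random
from dataclasses import dataclass, field

# Candidate operations and (hypothetical) hyperparameter ranges.
OPERATIONS = {
    "dense":     {"units": [32, 64, 128, 256]},
    "conv1d":    {"filters": [16, 32, 64], "kernel_size": [3, 5, 7]},
    "recurrent": {"units": [32, 64, 128]},
    "attention": {"heads": [1, 2, 4]},
}

@dataclass
class Node:
    op: str     # operation name, e.g. "conv1d"
    hp: dict    # its hyperparameters, e.g. {"filters": 32, "kernel_size": 5}

@dataclass
class CandidateDAG:
    nodes: list = field(default_factory=list)   # Node objects, index = node id
    edges: set = field(default_factory=set)     # (src, dst) with src < dst, hence acyclic

def random_node() -> Node:
    op = random.choice(list(OPERATIONS))
    hp = {k: random.choice(v) for k, v in OPERATIONS[op].items()}
    return Node(op, hp)

def random_dag(n_nodes: int = 4) -> CandidateDAG:
    # Sample a random DAG: nodes are topologically ordered, edges only go forward.
    dag = CandidateDAG(nodes=[random_node() for _ in range(n_nodes)])
    for dst in range(1, n_nodes):
        dag.edges.add((random.randrange(dst), dst))   # at least one incoming edge
    return dag

def mutate(dag: CandidateDAG) -> CandidateDAG:
    # Neighbourhood operator: change one node's operation or one of its hyperparameters.
    new = CandidateDAG(nodes=[Node(n.op, dict(n.hp)) for n in dag.nodes],
                       edges=set(dag.edges))
    idx = random.randrange(len(new.nodes))
    if random.random() < 0.5:
        new.nodes[idx] = random_node()                # swap the operation entirely
    else:
        node = new.nodes[idx]
        key = random.choice(list(node.hp))
        node.hp[key] = random.choice(OPERATIONS[node.op][key])
    return new

if __name__ == "__main__":
    random.seed(0)
    parent = random_dag()
    child = mutate(parent)
    print("parent:", [(n.op, n.hp) for n in parent.nodes], parent.edges)
    print("child: ", [(n.op, n.hp) for n in child.nodes], child.edges)

In this reading, any metaheuristic able to call such a mutation (and, optionally, a crossover defined over the same encoding) on candidate DAGs could drive the search over both architectures and hyperparameters, which is the role the abstract assigns to the evolutionary algorithm used in the experiments.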
Language:
English
Files:
- 2303.12797
- Open access