Document type :
Research report
Title :
An algorithmic framework for the optimization of deep neural networks architectures and hyperparameters
Author(s) :
Keisler, Julie [Author]
Université de Lille
EDF Labs
Talbi, El-Ghazali [Author]
Optimisation de grande taille et calcul large échelle [BONUS]
Université de Lille
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Inria Lille - Nord Europe
Claudel, Sandra [Author]
EDF Labs
Cabriel, Gilles [Author]
Université de Lille
EDF Labs
Publisher :
arXiv
Institution :
Université de Lille
Publication date :
2023
English keyword(s) :
Neural and Evolutionary Computing (cs.NE)
Artificial Intelligence (cs.AI)
Machine Learning (cs.LG)
FOS: Computer and information sciences
HAL domain(s) :
Computer Science [cs]
English abstract : [en]
In this paper, we propose an algorithmic framework to automatically generate efficient deep neural networks and optimize their associated hyperparameters. The framework is based on evolving directed acyclic graphs (DAGs), defining a more flexible search space than the existing ones in the literature. It allows mixtures of different classical operations (convolutions, recurrences and dense layers) as well as newer operations such as self-attention. Based on this search space, we propose neighbourhood and evolution search operators to optimize both the architecture and the hyperparameters of our networks. These search operators can be used with any metaheuristic capable of handling mixed search spaces. We tested our algorithmic framework with an evolutionary algorithm on a time series prediction benchmark. The results demonstrate that our framework was able to find models outperforming the established baseline on numerous datasets.
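The abstract describes networks encoded as DAGs over a mixed search space, with neighbourhood (mutation) operators acting on both the architecture and the hyperparameters. The following is a minimal illustrative sketch of that idea, not the authors' code: the DAG encoding, the operation vocabulary, the hyperparameter values and the three mutation moves are all assumptions chosen for illustration.

```python
# Hypothetical sketch of a DAG-encoded candidate network and a neighbourhood
# operator over a mixed (architecture + hyperparameter) search space.
# Not the paper's implementation; names and values are illustrative.
import random

OPS = ["conv", "recurrence", "dense", "attention"]  # assumed operation vocabulary

def random_node(rng):
    """Draw one node: an operation type plus a hyperparameter (layer width)."""
    return {"op": rng.choice(OPS), "units": rng.choice([16, 32, 64])}

def random_dag(rng, n_nodes=4):
    """Random DAG: nodes indexed 0..n-1, edges only from lower to higher index,
    which guarantees acyclicity by construction."""
    nodes = [random_node(rng) for _ in range(n_nodes)]
    edges = set()
    for j in range(1, n_nodes):
        edges.add((rng.randrange(j), j))  # each node gets at least one parent
    return {"nodes": nodes, "edges": edges}

def mutate(dag, rng):
    """Neighbourhood operator: perturb either a hyperparameter, a node's
    operation, or the graph structure (add a forward edge)."""
    child = {"nodes": [dict(n) for n in dag["nodes"]], "edges": set(dag["edges"])}
    move = rng.choice(["hyper", "op", "edge"])
    if move == "hyper":
        rng.choice(child["nodes"])["units"] = rng.choice([16, 32, 64])
    elif move == "op":
        rng.choice(child["nodes"])["op"] = rng.choice(OPS)
    else:
        j = rng.randrange(1, len(child["nodes"]))
        child["edges"].add((rng.randrange(j), j))  # forward edge keeps it acyclic
    return child

rng = random.Random(0)
parent = random_dag(rng)
child = mutate(parent, rng)
```

In a full evolutionary loop, `mutate` (and an analogous crossover over DAGs) would generate neighbours that are evaluated by training each candidate network, with any mixed-space metaheuristic driving the selection, as the abstract outlines.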
Language :
English
Files :
- 2303.12797 (open access)