An algorithmic framework for the optimization of deep neural networks architectures and hyperparameters
Document type :
Preprint or working paper
Title :
An algorithmic framework for the optimization of deep neural networks architectures and hyperparameters
Author(s) :
Keisler, Julie [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
EDF R&D [EDF R&D]
Optimisation, Simulation, Risque et Statistiques pour les Marchés de l’Energie [EDF R&D OSIRIS]
Talbi, El-Ghazali [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Claudel, Sandra [Author]
EDF R&D [EDF R&D]
Optimisation, Simulation, Risque et Statistiques pour les Marchés de l’Energie [EDF R&D OSIRIS]
Cabriel, Gilles [Author]
EDF R&D [EDF R&D]
Optimisation, Simulation, Risque et Statistiques pour les Marchés de l’Energie [EDF R&D OSIRIS]
Publication date :
2023-02-22
English keyword(s) :
Metaheuristics
Evolutionary Algorithm
AutoML
Neural Architecture Search
Hyperparameter optimization
Directed Acyclic Graphs
Time Series Forecasting
HAL domain(s) :
Computer Science [cs]/Artificial Intelligence [cs.AI]
Computer Science [cs]/Machine Learning [cs.LG]
Computer Science [cs]/Neural Networks [cs.NE]
English abstract : [en]
In this paper, we propose an algorithmic framework to automatically generate efficient deep neural networks and optimize their associated hyperparameters. The framework is based on evolving directed acyclic graphs (DAGs), defining a more flexible search space than the existing ones in the literature. It allows mixtures of different classical operations: convolutions, recurrences and dense layers, as well as more recent operations such as self-attention. Based on this search space, we propose neighbourhood and evolution search operators to optimize both the architecture and the hyperparameters of our networks. These search operators can be used with any metaheuristic capable of handling mixed search spaces. We tested our algorithmic framework with an evolutionary algorithm on a time series forecasting benchmark. The results demonstrate that our framework was able to find models outperforming the established baseline on numerous datasets.
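To make the abstract's description concrete, the following is a minimal, self-contained Python sketch of the general idea, using only the standard library. All names and structures here are hypothetical illustrations, not the authors' actual implementation: candidate networks are encoded as DAGs whose nodes carry one of several mixed operations (dense, convolutional, recurrent or attention layers) together with that operation's hyperparameters; a neighbourhood operator mutates either a hyperparameter, an operation, or an edge; and a simple evolutionary loop searches over these individuals.

import random
from dataclasses import dataclass

# Hypothetical operation vocabulary: each node mixes a layer type with its own
# hyperparameters, echoing the mixed search space described in the abstract.
OPERATIONS = {
    "dense":     {"units": [16, 32, 64, 128]},
    "conv1d":    {"filters": [8, 16, 32], "kernel_size": [3, 5, 7]},
    "recurrent": {"units": [16, 32, 64]},
    "attention": {"heads": [1, 2, 4]},
}

@dataclass
class Node:
    op: str
    hparams: dict

@dataclass
class DagIndividual:
    """A candidate architecture: nodes plus directed edges (i -> j, i < j),
    which keeps the graph acyclic by construction."""
    nodes: list
    edges: set

def random_node():
    op = random.choice(list(OPERATIONS))
    hp = {k: random.choice(v) for k, v in OPERATIONS[op].items()}
    return Node(op, hp)

def random_dag(n_nodes=4):
    nodes = [random_node() for _ in range(n_nodes)]
    # Connect each node to at least one earlier node so the graph is connected.
    edges = {(random.randrange(j), j) for j in range(1, n_nodes)}
    return DagIndividual(nodes, edges)

def mutate(ind):
    """Neighbourhood operator: perturb a hyperparameter, an operation,
    or the graph structure (toggle an edge)."""
    child = DagIndividual([Node(n.op, dict(n.hparams)) for n in ind.nodes],
                          set(ind.edges))
    move = random.choice(["hparam", "op", "edge"])
    if move == "hparam":
        node = random.choice(child.nodes)
        key = random.choice(list(node.hparams))
        node.hparams[key] = random.choice(OPERATIONS[node.op][key])
    elif move == "op":
        idx = random.randrange(len(child.nodes))
        child.nodes[idx] = random_node()
    else:
        j = random.randrange(1, len(child.nodes))
        i = random.randrange(j)
        # Toggle the edge (i, j); requiring i < j preserves acyclicity.
        child.edges ^= {(i, j)}
    return child

def evolve(fitness, pop_size=8, generations=20):
    """A (mu + lambda)-style evolutionary loop over DAG individuals."""
    population = [random_dag() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(population)) for _ in range(pop_size)]
        population = sorted(population + offspring, key=fitness)[:pop_size]
    return population[0]

if __name__ == "__main__":
    # Toy fitness: prefer sparser graphs. A real run would train the network
    # encoded by the DAG and return its validation error on the forecasting task.
    toy_fitness = lambda ind: len(ind.nodes) + 0.1 * len(ind.edges)
    best = evolve(toy_fitness)
    print([n.op for n in best.nodes], sorted(best.edges))

In this sketch the fitness function is a stand-in; in the setting described by the abstract, evaluating an individual would mean building and training the network that the DAG encodes, and any metaheuristic able to handle such mixed (categorical plus structural) search spaces could replace the simple loop shown here.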
Language :
English
Collections :
Source :
Files
- document (Open access)
- sample.pdf (Open access)
- 2303.12797 (Open access)