Decentralized learning with budgeted network load using Gaussian copulas and classifier ensembles
Document type:
Conference paper with proceedings
Permanent URL:
Title:
Decentralized learning with budgeted network load using Gaussian copulas and classifier ensembles
Author(s):
Klein, John [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille (CRIStAL) - UMR 9189
Albardan, Mahmoud [Author]
Guedj, Benjamin [Author]
Laboratoire Paul Painlevé - UMR 8524 [LPP]
Colot, Olivier [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille (CRIStAL) - UMR 9189
Conference:
ECML-PKDD, Decentralized Machine Learning at the Edge Workshop
City:
Würzburg
Country:
Germany
Conference start date:
2019-09-16
Publication date:
2019-09-16
Keyword(s):
Classifier ensemble
Decentralized learning
Copulas
Machine learning
HAL discipline(s):
Physics [physics]/Data Analysis, Statistics and Probability [physics.data-an]
English abstract:
We examine a network of learners which address the same classification task but must learn from different data sets. The learners can share a limited portion of their data sets so as to limit the network load. We introduce DELCO (standing for Decentralized Ensemble Learning with COpulas), a new approach in which the shared data and the trained models are sent to a central machine that builds an ensemble of classifiers. The proposed method aggregates the base classifiers using a probabilistic model relying on Gaussian copulas. Experiments on logistic regression ensembles demonstrate competitive accuracy and increased robustness compared to gold-standard approaches. A companion Python implementation can be downloaded at https://github.com/john-klein/DELCO
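The abstract describes fusing base classifiers with a probabilistic model built on Gaussian copulas. The following is a minimal illustrative sketch of that general idea, not the authors' exact model (see the linked repository for the real implementation): it evaluates a Gaussian copula density on per-classifier scores and weights each class by its prior. The function names, the uniform-marginal simplification, and the per-class correlation matrices are assumptions; a faithful version would plug in marginal CDFs estimated per classifier. NumPy and SciPy are assumed available.

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_density(u, R):
    """Density of a Gaussian copula with correlation matrix R at u in (0,1)^d."""
    # Map uniforms to standard normal scores (clipped to avoid infinities).
    z = norm.ppf(np.clip(u, 1e-6, 1 - 1e-6))
    R_inv = np.linalg.inv(R)
    quad = -0.5 * z @ (R_inv - np.eye(len(z))) @ z
    return np.exp(quad) / np.sqrt(np.linalg.det(R))

def copula_aggregate(probs, R_per_class, priors):
    """Fuse base-classifier scores for a single sample.

    probs: (m, K) array, probs[i, k] = classifier i's score for class k.
    R_per_class: list of K (m, m) correlation matrices (one copula per class).
    priors: (K,) class priors.
    Simplification: scores are treated as already uniform on (0, 1); the full
    model would transform them through estimated marginal CDFs first.
    """
    m, K = probs.shape
    scores = np.array([
        priors[k] * gaussian_copula_density(probs[:, k], R_per_class[k])
        for k in range(K)
    ])
    return scores / scores.sum()  # normalized posterior-like scores
```

With identity correlation matrices the copula density is 1 everywhere, so the fusion reduces to the independence assumption; off-diagonal correlations let the model account for dependence between base classifiers, which is the motivation for using copulas here.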
Language:
English
Audience:
International
Popular science:
No
Institution(s):
CNRS
Centrale Lille
Université de Lille
Deposit date:
2020-06-08T14:10:42Z
Files:
- Open access