Decentralized learning with budgeted network load using Gaussian copulas and classifier ensembles
Document type:
Conference paper (with proceedings)
Title:
Decentralized learning with budgeted network load using Gaussian copulas and classifier ensembles
Author(s):
Klein, John [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Albardan, Mahmoud [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Guedj, Benjamin [Author]
Department of Computer Science [University College London] [UCL-CS]
University College London [London] [UCL]
Inria-CWI [Inria-CWI]
MOdel for Data Analysis and Learning [MODAL]
Laboratoire Paul Painlevé - UMR 8524 [LPP]
Colot, Olivier [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Conference title:
ECML-PKDD, Decentralized Machine Learning at the Edge Workshop
City:
Würzburg
Country:
Germany
Conference start date:
2019-09-16
Keywords (in English):
decentralized learning
classifier ensembles
machine learning
copulas
HAL discipline(s):
Physics [physics]/Data Analysis, Statistics and Probability [physics.data-an]
Computer Science [cs]/Artificial Intelligence [cs.AI]
Computer Science [cs]/Distributed, Parallel, and Cluster Computing [cs.DC]
Computer Science [cs]/Machine Learning [cs.LG]
Abstract (in English):
We examine a network of learners that address the same classification task but must learn from different data sets. The learners can share only a limited portion of their data sets so as to keep the network load within budget. We introduce DELCO (Decentralized Ensemble Learning with COpulas), a new approach in which the shared data and the trained models are sent to a central machine, where an ensemble of classifiers is built. The proposed method aggregates the base classifiers using a probabilistic model relying on Gaussian copulas. Experiments on logistic regression ensembles demonstrate competitive accuracy and increased robustness compared to gold-standard approaches. A companion Python implementation can be downloaded at https://github.com/john-klein/DELCO
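The copula-based aggregation step described in the abstract can be sketched roughly as follows. This is a minimal illustration under simplifying assumptions, not the paper's actual estimator: it treats each base classifier's posterior for a class as an already-calibrated value in (0,1), and takes the per-class correlation matrices as given rather than learned from the shared data (the names `gaussian_copula_density` and `aggregate` are hypothetical).

```python
import numpy as np
from statistics import NormalDist


def gaussian_copula_density(u, R):
    """Gaussian copula density c(u; R) = |R|^{-1/2} exp(-z^T (R^{-1} - I) z / 2),
    where z = Phi^{-1}(u) maps the (0,1) marginals to standard-normal scores."""
    u = np.clip(u, 1e-6, 1.0 - 1e-6)          # avoid inf at the boundaries
    z = np.array([NormalDist().inv_cdf(v) for v in u])
    quad = z @ (np.linalg.inv(R) - np.eye(len(z))) @ z
    return np.exp(-0.5 * quad) / np.sqrt(np.linalg.det(R))


def aggregate(posteriors, R_per_class):
    """Combine base-classifier outputs into one class distribution.

    posteriors  : (n_classifiers, n_classes) array of class probabilities
    R_per_class : one correlation matrix per class, modeling the dependence
                  between the classifiers' outputs for that class
    """
    n_classes = posteriors.shape[1]
    scores = np.empty(n_classes)
    for k in range(n_classes):
        u = posteriors[:, k]
        # product of marginals times the copula term capturing dependence;
        # with identity R the copula term is 1 (independent classifiers)
        scores[k] = np.prod(u) * gaussian_copula_density(u, R_per_class[k])
    return scores / scores.sum()
```

With identity correlation matrices the copula factor equals 1 and the rule reduces to a normalized product of the base posteriors; non-trivial matrices down-weight agreement that is explained by classifier correlation rather than by the data.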
Language:
English
Peer-reviewed:
Yes
Audience:
International
Popular science:
No
Files:
1804.10028.pdf (open access)