Escaping the Curse of Dimensionality in Similarity Learning: Efficient Frank-Wolfe Algorithm and Generalization Bounds
Document type :
Compte-rendu et recension critique d'ouvrage
Title :
Escaping the Curse of Dimensionality in Similarity Learning: Efficient Frank-Wolfe Algorithm and Generalization Bounds
Author(s) :
Liu, Kuan [Auteur]
Google Inc.
Bellet, Aurelien [Auteur]
Machine Learning in Information Networks [MAGNET]
Journal title :
Neurocomputing
Pages :
185-199
Publisher :
Elsevier
Publication date :
2019
ISSN :
0925-2312
English keyword(s) :
Metric learning
Frank-Wolfe algorithm
Generalization bounds
HAL domain(s) :
Informatique [cs]/Apprentissage [cs.LG]
Statistiques [stat]/Machine Learning [stat.ML]
English abstract : [en]
Similarity and metric learning provides a principled approach to construct a task-specific similarity from weakly supervised data. However, these methods are subject to the curse of dimensionality: as the number of features grows large, poor generalization is to be expected and training becomes intractable due to high computational and memory costs. In this paper, we propose a similarity learning method that can efficiently deal with high-dimensional sparse data. This is achieved through a parameterization of similarity functions by convex combinations of sparse rank-one matrices, together with the use of a greedy approximate Frank-Wolfe algorithm which provides an efficient way to control the number of active features. We show that the convergence rate of the algorithm, as well as its time and memory complexity, are independent of the data dimension. We further provide a theoretical justification of our modeling choices through an analysis of the generalization error, which depends logarithmically on the sparsity of the solution rather than on the number of features. Our experiments on datasets with up to one million features demonstrate the ability of our approach to generalize well despite the high dimensionality as well as its superiority compared to several competing methods.
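To illustrate the idea sketched in the abstract, here is a minimal, hypothetical Frank-Wolfe iteration over the convex hull of sparse rank-one atoms. This is not the paper's actual algorithm or loss: the objective below is a toy Frobenius-norm fit, and the atom set (signed single-entry matrices, i.e. rank-one matrices lam * e_i e_j^T) is a simplified stand-in for the atom set used in the paper. It only shows the mechanics: each iterate is a convex combination of rank-one atoms, so after t steps the solution has at most t active entries, independent of the ambient dimension.

```python
import numpy as np

def frank_wolfe_rank_one(target, lam=1.0, n_iters=300):
    """Toy Frank-Wolfe over D = conv{ +/- lam * e_i e_j^T }.

    Minimizes the stand-in objective f(M) = 0.5 * ||M - target||_F^2
    (the paper uses a similarity-learning loss instead). Each update
    mixes in one sparse rank-one atom, so iterates stay sparse.
    """
    M = np.zeros_like(target, dtype=float)
    for t in range(n_iters):
        grad = M - target                       # gradient of the toy objective
        # Linear minimization oracle: over D, <grad, S> is minimized by the
        # atom +/- lam * e_i e_j^T at the entry where |grad| is largest,
        # with sign opposite to the gradient.
        i, j = np.unravel_index(np.argmax(np.abs(grad)), grad.shape)
        atom = np.zeros_like(M)
        atom[i, j] = -lam * np.sign(grad[i, j])
        gamma = 2.0 / (t + 2)                   # standard Frank-Wolfe step size
        M = (1 - gamma) * M + gamma * atom      # convex-combination update
    return M
```

Because the linear oracle touches only one coordinate per step, the per-iteration cost depends on locating the largest gradient entry, not on storing a dense d x d parameter, which is the mechanism behind the dimension-independent complexity claimed in the abstract.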
Language :
English
Popular science :
Non
Collections :
Source :
Files
- https://hal.inria.fr/hal-02166425/document (neurocomp19.pdf) - Open access
- http://arxiv.org/pdf/1807.07789 - Open access