Bayesian nonparametric Principal Component Analysis
Document type:
Preprint or working paper
Title:
Bayesian nonparametric Principal Component Analysis
Author(s):
Elvira, Clément [Auteur]
Centrale Lille
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Chainais, Pierre [Auteur]
Centrale Lille
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Dobigeon, Nicolas [Auteur]
Signal et Communications [IRIT-SC]
Institut National Polytechnique (Toulouse) [Toulouse INP]
Keywords (English):
Bayesian nonparametrics
dimension reduction
distribution on the Stiefel manifold
Indian buffet process
HAL discipline(s):
Statistics [stat]/Machine Learning [stat.ML]
Computer Science [cs]/Machine Learning [cs.LG]
Computer Science [cs]/Signal and Image Processing [eess.SP]
Abstract (English):
Principal component analysis (PCA) is a very popular method for dimension reduction. Selecting the number of significant components is essential but is often based on practical heuristics that depend on the application. Only a few works have proposed a probabilistic approach able to infer the number of significant components. To this end, this paper introduces Bayesian nonparametric principal component analysis (BNP-PCA). The proposed model projects observations onto a random orthogonal basis which is assigned a prior distribution defined on the Stiefel manifold. The prior on the factor scores involves an Indian buffet process to model the uncertainty related to the number of components. The parameters of interest, as well as the nuisance parameters, are inferred within a fully Bayesian framework via Monte Carlo sampling. A study of the (in)consistency of the marginal maximum a posteriori estimator of the latent dimension is carried out, and a new estimator of the subspace dimension is proposed. Moreover, for the sake of statistical significance, a Kolmogorov-Smirnov test based on the posterior distribution of the principal components is used to refine this estimate. The behaviour of the algorithm is first studied on various synthetic examples. Finally, the proposed BNP dimension reduction approach is shown to be easily and efficiently coupled with clustering or latent factor models within a unified framework.
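The Indian buffet process (IBP) mentioned in the abstract is the prior that lets the number of active components remain unbounded a priori. The following is a minimal sketch of the standard culinary-metaphor generative scheme for an IBP, not the paper's own implementation: customer i takes an existing dish k with probability m_k / i (m_k = number of previous takers) and then samples Poisson(alpha / i) new dishes. The function name and interface are hypothetical.

```python
import numpy as np

def sample_ibp(n_customers, alpha, seed=None):
    """Draw a binary feature-allocation matrix Z from an Indian buffet
    process with concentration alpha (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    dish_counts = []   # m_k: number of customers who took dish k so far
    rows = []
    for i in range(1, n_customers + 1):
        # Existing dishes: take dish k with probability m_k / i.
        row = [rng.random() < m / i for m in dish_counts]
        for k, taken in enumerate(row):
            if taken:
                dish_counts[k] += 1
        # New dishes: Poisson(alpha / i) previously untasted dishes.
        n_new = rng.poisson(alpha / i)
        dish_counts.extend([1] * n_new)
        row.extend([True] * n_new)
        rows.append(row)
    # Pad earlier rows with zeros for dishes introduced later.
    K = len(dish_counts)
    Z = np.zeros((n_customers, K), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

Z = sample_ibp(20, alpha=1.5, seed=42)
print(Z.shape)  # (20, K) with K random, roughly alpha * log(20) columns on average
```

The number of columns K is random, which is exactly how the IBP encodes uncertainty about the number of latent components in a model such as BNP-PCA.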
Language:
English
Collections:
Source:
Files
- https://hal.archives-ouvertes.fr/hal-01687236/document
- Open access
- Access the document
- http://arxiv.org/pdf/1709.05667
- Open access
- Access the document