Bayesian nonparametric Principal Component Analysis
Document type :
Preprint or Working Paper
Title :
Bayesian nonparametric Principal Component Analysis
Author(s) :
Elvira, Clément [Auteur]
Centrale Lille
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Chainais, Pierre [Auteur]
Centrale Lille
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Dobigeon, Nicolas [Auteur]
Signal et Communications [IRIT-SC]
Institut National Polytechnique (Toulouse) [Toulouse INP]
English keyword(s) :
Bayesian nonparametrics
dimension reduction
distribution on the Stiefel manifold
Indian buffet process
HAL domain(s) :
Statistics [stat]/Machine Learning [stat.ML]
Computer Science [cs]/Machine Learning [cs.LG]
Computer Science [cs]/Signal and Image Processing [eess.SP]
English abstract : [en]
Principal component analysis (PCA) is a widely used tool for dimension reduction. Selecting the number of significant components is essential, but it is often based on practical heuristics that depend on the application. Only a few works have proposed a probabilistic approach able to infer the number of significant components. To this purpose, this paper introduces a Bayesian nonparametric principal component analysis (BNP-PCA). The proposed model projects observations onto a random orthogonal basis which is assigned a prior distribution defined on the Stiefel manifold. The prior on factor scores involves an Indian buffet process to model the uncertainty related to the number of components. The parameters of interest as well as the nuisance parameters are finally inferred within a fully Bayesian framework via Monte Carlo sampling. A study of the (in)consistency of the marginal maximum a posteriori estimator of the latent dimension is carried out. A new estimator of the subspace dimension is proposed. Moreover, for the sake of statistical significance, a Kolmogorov-Smirnov test based on the posterior distribution of the principal components is used to refine this estimate. The behaviour of the algorithm is first studied on various synthetic examples. Finally, the proposed BNP dimension reduction approach is shown to be easily yet efficiently coupled with clustering or latent factor models within a unified framework.
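The Indian buffet process mentioned in the abstract is the prior that lets the number of latent components remain random. As a minimal illustration (not code from the paper; the helper name `sample_ibp` is an assumption), the following sketch draws a binary feature-allocation matrix from the standard IBP generative scheme, where the random number of active columns plays the role of the unknown number of components:

```python
import numpy as np

def sample_ibp(n_customers, alpha, seed=None):
    """Draw a binary matrix Z from the Indian buffet process.

    Rows are observations ("customers"), columns are latent components
    ("dishes"). Customer i takes an existing dish k with probability
    m_k / i (m_k = customers who already took it), then tries
    Poisson(alpha / i) new dishes, so the column count K is random.
    """
    rng = np.random.default_rng(seed)
    dish_counts = []  # m_k for each dish sampled so far
    rows = []
    for i in range(1, n_customers + 1):
        # Existing dishes: take dish k with probability m_k / i.
        row = [rng.random() < m / i for m in dish_counts]
        for k, taken in enumerate(row):
            if taken:
                dish_counts[k] += 1
        # New dishes for this customer: Poisson(alpha / i), each with count 1.
        n_new = rng.poisson(alpha / i)
        dish_counts.extend([1] * n_new)
        rows.append(row + [True] * n_new)
    # Pad earlier (shorter) rows with False into a dense boolean matrix.
    K = len(dish_counts)
    Z = np.zeros((n_customers, K), dtype=bool)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

Z = sample_ibp(100, alpha=2.0, seed=0)
print(Z.shape)  # (100, K) with K random but finite
```

In the paper's BNP-PCA model this binary pattern multiplies the factor scores, so components whose column is never activated are effectively pruned, which is how the posterior carries information about the latent dimension.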
Language :
English
Collections :
Source :
Files
- https://hal.archives-ouvertes.fr/hal-01687236/document (open access)
- http://arxiv.org/pdf/1709.05667 (open access)