Applications of common principal components in multivariate and high-dimensional analysis

Abstract: This thesis consists of four papers, all exploring some aspect of common principal component analysis (CPCA), the generalization of PCA to multiple groups. The basic assumption of the CPC model is that the space spanned by the eigenvectors is identical across several groups, whereas the eigenvalues associated with those eigenvectors may vary between groups. CPCA is used in essentially the same areas and applications as PCA.

The first paper compares the performance of the maximum likelihood and Krzanowski's estimators of the CPC model on two real-world datasets and in a Monte Carlo simulation study. The simplicity and intuitive appeal of Krzanowski's estimator, together with the findings of this paper, support and promote the use of this estimator for CPC models over the maximum likelihood estimator.

The second paper uses CPCA as a tool for imposing restrictions on system-wise regression models. It contributes to the field by proposing a variety of explicit estimators, deriving their properties, and identifying the appropriate amount of smoothing that should be imposed on the estimator.

In the third paper, a generalization of the fixed-effects PCA model to multiple populations in a CPC setting is proposed. The model rests mainly on geometrical, rather than probabilistic, assumptions and is designed to account for any prior information about the noise in the data to yield better estimates, obtained by minimizing a least squares criterion with respect to a specified metric.

The fourth paper surveys some properties of the orthogonal group and the associated Haar measure on it. It demonstrates how seemingly abstract results contribute to applied statistics and, in particular, to PCA.
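The CPC assumption described above (one common set of eigenvectors, group-specific eigenvalues) can be illustrated with a minimal numerical sketch. The estimator below, eigenvectors of the averaged sample covariances, is a simplified stand-in chosen for brevity, not the maximum likelihood or Krzanowski estimators studied in the papers:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4

# Common orthonormal eigenvectors B, shared across groups (via QR)
B, _ = np.linalg.qr(rng.standard_normal((p, p)))

# Group-specific eigenvalues: same eigenvectors, different variances
lams = [np.array([9.0, 4.0, 1.0, 0.5]), np.array([6.0, 5.0, 2.0, 0.2])]
samples = [rng.multivariate_normal(np.zeros(p), B @ np.diag(l) @ B.T, size=2000)
           for l in lams]

# Simplified CPC estimate: eigenvectors of the averaged sample covariances
# (an illustrative stand-in, not the estimators compared in the thesis)
S_bar = sum(np.cov(x, rowvar=False) for x in samples) / len(samples)
_, B_hat = np.linalg.eigh(S_bar)

# Compare estimated and true eigenvector directions: for a good fit,
# each column of |B_hat^T B| has an entry near 1 (up to sign and order)
align = np.abs(B_hat.T @ B).max(axis=0)
print(align)
```

With the well-separated averaged eigenvalues used here, each alignment value should be close to 1, reflecting recovery of the common eigenvector directions.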
