Eigenvalues in high dimensional principal component analysis
Eigenvalues in high dimensional (n<<p) principal component analysis.
hd.eigen(x, center = TRUE, scale = FALSE, k = NULL, vectors = FALSE)
x |
A numerical n \times p matrix with data where the rows are the observations and the columns are the variables. |
center |
Do you want your data centered? TRUE or FALSE. |
scale |
Do you want each of your variables scaled, i.e. to have unit variance? TRUE or FALSE. |
k |
If you want a specific number of eigenvalues and eigenvectors, set it here; otherwise all eigenvalues (and eigenvectors, if requested) will be returned. |
vectors |
Do you want the eigenvectors to be returned? By default this is FALSE (see the sketch after this table). |
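A brief illustration of how k and vectors affect the output (a sketch only, using the matrnorm function that also appears in the examples below):

x <- matrnorm(40, 100)   ## 40 observations, 100 variables
a <- hd.eigen(x, center = TRUE, scale = FALSE, k = 5, vectors = TRUE)
length(a$values)   ## 5 eigenvalues
dim(a$vectors)     ## a 100 x 5 matrix of eigenvectors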
When n<<p, at most the first n eigenvalues are non-zero. Hence, there is no need to calculate the other p-n zero eigenvalues. When center is TRUE, the eigenvalues of the covariance matrix are calculated. When both center and scale are TRUE, the eigenvalues of the correlation matrix are calculated. One or more eigenvectors (towards the end) will be 0. In general the signs might be the opposite of R's, but this makes no difference.
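The shortcut can be sketched in base R as follows (this illustrates the idea only and is not the package's internal code): the non-zero eigenvalues of the p x p covariance matrix coincide with those of the small n x n cross-product matrix.

n <- 10  ;  p <- 200
x <- matrix( rnorm(n * p), n, p )
xc <- scale(x, center = TRUE, scale = FALSE)   ## centre the columns
small <- tcrossprod(xc) / (n - 1)   ## n x n matrix
big <- crossprod(xc) / (n - 1)      ## p x p covariance matrix
## the non-zero eigenvalues of the two matrices coincide
all.equal( eigen(small, symmetric = TRUE)$values,
           eigen(big, symmetric = TRUE)$values[1:n] )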
A list including:
values |
A vector with the n (or first k) eigenvalues. The divisor in the cross-product matrix is n-1 and not n. |
vectors |
A p \times n (or p \times k) matrix with the eigenvectors (see the sketch below). |
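For instance, the principal component scores can be formed by projecting the centred data onto the returned eigenvectors (a minimal sketch, not part of the package itself):

x <- matrnorm(40, 100)
a <- hd.eigen(x, center = TRUE, scale = FALSE, k = 3, vectors = TRUE)
xc <- scale(x, center = TRUE, scale = FALSE)   ## centre the columns, as center = TRUE does internally
scores <- xc %*% a$vectors   ## 40 x 3 matrix of scores
apply(scores, 2, var)        ## should agree with a$values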
Michail Tsagris
R implementation and documentation: Michail Tsagris <mtsagris@yahoo.gr>.
x <- matrnorm(40, 100)
a <- hd.eigen(x, FALSE, FALSE)
b <- prcomp(x, center = FALSE, scale = FALSE)
a
b$sdev^2
x <- NULL