Functional principal component analysis by a two-stage method
This function performs functional PCA by applying an ordinary singular value decomposition to the functional data matrix, then smoothing the right singular vectors with smoothing splines.

Usage:
fpca2s(Y = NULL, ydata = NULL, argvals = NULL, npc = NA, center = TRUE, smooth = TRUE)
Arguments:
Y: data matrix (rows: observations; columns: grid of evaluation points).
ydata: a data frame representing irregularly observed functions.
argvals: the argument values of the function evaluations in Y; defaults to an equidistant grid from 0 to 1.
npc: how many smooth singular vectors to try to extract; if NA (the default), the hard-thresholding rule of Gavish and Donoho (2014) is used (see References).
center: logical; center Y so that its column means are zero? Defaults to TRUE.
smooth: logical; defaults to TRUE. If FALSE, the eigenvectors are not smoothed.
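The default choice of npc follows the hard-thresholding rule of Gavish and Donoho (2014). As a minimal sketch (not refund's internal code), for a square n-by-n matrix with unknown noise level, the rule keeps only singular values exceeding roughly 2.858 times the median singular value; the constant 2.858 is the paper's approximation of the optimal coefficient for square matrices:

```r
# Hedged sketch of the Gavish-Donoho hard threshold for a square matrix
# (illustration only; refund's implementation handles the general case).
set.seed(1)
n <- 200
r <- 3                                        # true rank of the signal
A <- matrix(rnorm(n * r), n, r) %*% matrix(rnorm(r * n), r, n) * 2
Y <- A + matrix(rnorm(n * n), n, n)           # signal plus unit-variance noise
d <- svd(Y)$d                                 # singular values of the noisy matrix
npc <- sum(d > 2.858 * median(d))             # number of retained components
npc                                           # recovers the true rank here
```

The median singular value serves as a robust estimate of the noise scale, so no explicit noise-variance estimate is needed.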
Details:
Note that fpca2s computes smoothed orthonormal eigenvectors of the supplied function evaluations (and associated scores), not (!) evaluations of the smoothed orthonormal eigenfunctions. The smoothed orthonormal eigenvectors are then rescaled by the length of the domain defined by argvals so that their quadratic integral is approximately equal to one (instead of a crossproduct equal to one); they thus approximate the behavior of smooth eigenfunctions. If argvals is not equidistant, fpca2s simply returns the smoothed eigenvectors without rescaling, with a warning.
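The rescaling described above can be illustrated with a short sketch (not part of refund): a unit-norm vector v on an equidistant grid of J points over a domain of length L is multiplied by sqrt(J / L), after which the Riemann sum approximating the integral of its square equals one.

```r
# Sketch: rescaling a unit-crossproduct eigenvector so its squared values
# integrate to approximately one over the domain (here [0, 1]).
J <- 1000                            # number of grid points
t <- seq(0, 1, length.out = J)       # equidistant argvals
v <- sin(2 * pi * t)                 # a raw eigenvector shape
v <- v / sqrt(sum(v^2))              # unit crossproduct: t(v) %*% v == 1
L <- diff(range(t))                  # domain length
phi <- v * sqrt(J / L)               # rescaled, function-like eigenvector
sum(phi^2) * L / J                   # Riemann sum of phi^2: approximately 1
```

This is why the returned efunctions are on the scale of eigenfunction evaluations only when argvals is equidistant.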
Value:
An fpca object like that returned from fpca.sc, with entries Yhat, the smoothed trajectories; Y, the observed data; scores, the estimated FPC loadings; mu, the column means of Y (or a vector of zeroes if !center); efunctions, the estimated smooth FPCs (note that these are orthonormal vectors, not evaluations of orthonormal functions, if argvals is not equidistant); evalues, their associated eigenvalues; and npc, the number of smooth components that were extracted.
Author(s): Luo Xiao (lxiao@jhsph.edu), Fabian Scheipl
References:
Xiao, L., Ruppert, D., Zipunnikov, V., and Crainiceanu, C. (2013). Fast covariance estimation for high-dimensional functional data. (submitted) https://arxiv.org/abs/1306.5718
Gavish, M., and Donoho, D. L. (2014). The optimal hard threshold for singular values is 4/sqrt(3). IEEE Transactions on Information Theory, 60(8), 5040–5053.
Examples:

#### settings
I <- 50        # number of subjects
J <- 3000      # dimension of the data
t <- (1:J)/J   # a regular grid on [0, 1]
N <- 4         # number of eigenfunctions
sigma <- 2     # standard deviation of the random noise
lambdaTrue <- c(1, 0.5, 0.5^2, 0.5^3)  # true eigenvalues
case <- 1

### true eigenfunctions
if (case == 1) phi <- sqrt(2) * cbind(sin(2*pi*t), cos(2*pi*t),
                                      sin(4*pi*t), cos(4*pi*t))
if (case == 2) phi <- cbind(rep(1, J), sqrt(3)*(2*t - 1),
                            sqrt(5)*(6*t^2 - 6*t + 1),
                            sqrt(7)*(20*t^3 - 30*t^2 + 12*t - 1))

###################################################
########          generate data           #########
###################################################
xi <- matrix(rnorm(I*N), I, N)
xi <- xi %*% diag(sqrt(lambdaTrue))
X <- xi %*% t(phi)                       # of size I by J
Y <- X + sigma * matrix(rnorm(I*J), I, J)

results <- fpca2s(Y, npc = 4, argvals = t)

###################################################
####                 SVDS                      ####
###################################################
Phi <- results$efunctions
eigenvalues <- results$evalues

# align the signs of estimated and true eigenfunctions
for (k in 1:N) {
  if (Phi[, k] %*% phi[, k] < 0) Phi[, k] <- -Phi[, k]
}

### plot eigenfunctions
par(mfrow = c(N/2, 2))
seq <- (1:(J/10)) * 10
for (k in 1:N) {
  plot(t[seq], Phi[seq, k] * sqrt(J), type = 'l', lwd = 3,
       ylim = c(-2, 2), col = 'red',
       ylab = paste('Eigenfunction ', k, sep = ''),
       xlab = 't', main = 'SVDS')
  lines(t[seq], phi[seq, k], lwd = 2, col = 'black')
}