Cross-validation for the LASSO Kullback-Leibler divergence based regression
Cross-validation for the LASSO Kullback-Leibler divergence based regression.
cv.lasso.klcompreg(y, x, alpha = 1, nfolds = 10, folds = NULL, seed = FALSE, graph = FALSE)
y
A numerical matrix with compositional data, with or without zeros.
x
A matrix with the predictor variables.
alpha
The elastic net mixing parameter, with 0 ≤ α ≤ 1. The penalty is defined as a weighted combination of the ridge and the LASSO penalties: α = 1 applies the LASSO, while α = 0 yields ridge regression.
nfolds
The number of folds for the K-fold cross-validation, set to 10 by default.
folds
If you already have a list with the folds, supply it here. You can also leave it NULL and the folds will be created internally.
seed
If seed is TRUE the results will always be the same.
graph
If graph is TRUE, a plot of the cross-validated object will appear.
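A hedged sketch of supplying your own folds, under the assumption that the folds argument accepts a list of integer index vectors, one vector per fold (the exact structure expected by the package may differ):

```r
n <- 214
nfolds <- 10
idx <- sample(n)                                    ## shuffle the row indices 1..n
folds <- split(idx, rep(1:nfolds, length.out = n))  ## one index vector per fold
## mod <- cv.lasso.klcompreg(y, x, folds = folds, nfolds = nfolds)
```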
The K-fold cross-validation is performed in order to select the optimal value of λ, the penalty parameter of the LASSO.
The outcome is the same as in the R package glmnet. The extra addition is that if graph = TRUE, the plot of the cross-validated object is returned. The plot shows the logarithm of λ against the deviance. The numbers along the top of the figure show the number of sets of coefficients, one set per component, that are non-zero.
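Since the outcome is stated to be the same as in glmnet, the cross-validated penalty can presumably be read off as from a cv.glmnet object; a minimal sketch, assuming the returned object carries the standard cv.glmnet fields:

```r
library(MASS)
library(Compositional)
y <- rdiri(214, runif(4, 1, 3))    ## simulated compositional responses
x <- as.matrix(fgl[, 2:9])
x <- x / rowSums(x)
mod <- cv.lasso.klcompreg(y, x)
mod$lambda.min   ## lambda minimising the cross-validated deviance (assumed field)
mod$lambda.1se   ## largest lambda within one standard error of that minimum (assumed field)
```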
Michail Tsagris and Abdulaziz Alenazi.
R implementation and documentation: Michail Tsagris mtsagris@uoc.gr and Abdulaziz Alenazi a.alenazi@nbu.edu.sa.
Friedman, J., Hastie, T. and Tibshirani, R. (2010) Regularization Paths for Generalized Linear Models via Coordinate Descent. Journal of Statistical Software, Vol. 33(1), 1-22.
lasso.klcompreg, lassocoef.plot, kl.compreg, ols.compreg, alfa.pcr, alfa.knn.reg
library(MASS)
y <- rdiri( 214, runif(4, 1, 3) )
x <- as.matrix( fgl[, 2:9] )
x <- x / rowSums(x)
mod <- cv.lasso.klcompreg(y, x)