Fits Elastic Net regression models
Starting from zero, the LARS-EN algorithm provides the entire sequence of coefficients and fits.
enet(x, y, lambda, max.steps, normalize=TRUE, intercept=TRUE, trace = FALSE, eps = .Machine$double.eps)
x: matrix of predictors
y: response
lambda: Quadratic penalty parameter. lambda=0 performs the Lasso fit.
max.steps: Limit the number of steps taken; if not supplied, a default based on the dimensions of the data is used.
trace: If TRUE, print out progress as the algorithm runs.
normalize: Standardize the predictors? Default is TRUE.
intercept: Center the predictors? Default is TRUE.
eps: An effective zero.
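A minimal sketch (not part of the package's own examples) of how these arguments fit together on simulated data; the variable names below are purely illustrative:

library(elasticnet)
set.seed(1)
x <- matrix(rnorm(100 * 10), 100, 10)   # 100 observations, 10 predictors
y <- x[, 1] - 2 * x[, 2] + rnorm(100)
## lambda sets the quadratic (ridge) penalty; the l1 path is traced by LARS-EN
fit <- enet(x, y, lambda = 0.5, max.steps = 20,
            normalize = TRUE, intercept = TRUE, trace = FALSE)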
The Elastic Net methodology is described in detail in Zou and Hastie (2005). The LARS-EN algorithm computes the complete elastic net solution path simultaneously for ALL values of the shrinkage parameter, at the same computational cost as a least squares fit. The structure of enet() is based on lars(), coded by Efron and Hastie. Some internal functions from the lars package are called, so the user should install lars before using the elasticnet functions.
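Since elasticnet relies on internal functions from lars, both packages need to be available; a one-line way to set this up (standard R, nothing specific to this function):

install.packages(c("lars", "elasticnet"))
library(elasticnet)   # lars is attached as a dependency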
An "enet" object is returned, for which print, plot and predict methods exist.
Hui Zou and Trevor Hastie
Zou, H. and Hastie, T. (2005). "Regularization and variable selection via the elastic net." Journal of the Royal Statistical Society, Series B, 67, 301-320.
print, plot, and predict methods for enet
data(diabetes)
attach(diabetes)
## fit the lasso model (treated as a special case of the elastic net)
object1 <- enet(x, y, lambda = 0)
plot(object1)
## fit the elastic net model with lambda=1
object2 <- enet(x, y, lambda = 1)
plot(object2)
## early stopping after 50 LARS-EN steps
object4 <- enet(x2, y, lambda = 0.5, max.steps = 50)
plot(object4)
detach(diabetes)
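A hedged follow-up to the examples above: because a single LARS-EN fit covers every value of the l1 shrinkage, coefficients at any point on the path can be read off an existing fit without refitting (argument names follow the lars-style predict method):

## coefficients of object2 at 30% of the maximal l1 norm
predict(object2, s = 0.3, type = "coefficients", mode = "fraction")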