Modeling techniques such as the Elastic Net and the Lasso are particularly promising for large data sets, where collinearity is typically a problem. In fact, modern data sets often include more variables than observations. This situation is sometimes referred to as the p > n problem, where n is the number of observations and p is the number of predictors. Such data sets require variable selection if traditional modeling techniques are to be used.
The Elastic Net and the Lasso are relatively recent techniques (Tibshirani 1996; Zou and Hastie 2005). Both techniques penalize the size of the model coefficients, resulting in continuous shrinkage. The amount of shrinkage is determined by a tuning parameter, and an optimal level of shrinkage is determined by one of several validation methods. Both techniques can shrink coefficients exactly to zero, so that variable selection is built into the modeling procedure. The Elastic Net subsumes both the Lasso and ridge regression as special cases. For details, see Statistical Details for Estimation Methods.
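To make the mechanics concrete, the following is a minimal sketch using scikit-learn's ElasticNetCV (an illustrative stand-in, not the Generalized Regression implementation). Cross validation chooses the shrinkage parameter, the l1_ratio mixing parameter moves the penalty between ridge regression and the Lasso, and coefficients shrunk exactly to zero drop out of the model. The data set sizes and parameter values below are arbitrary choices for illustration.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import ElasticNetCV

    # Simulated p > n data: 50 observations, 100 candidate predictors,
    # only 5 of which are truly active.
    X, y = make_regression(n_samples=50, n_features=100, n_informative=5,
                           noise=1.0, random_state=0)

    # Cross validation selects the amount of shrinkage (alpha) and the
    # mixing parameter; l1_ratio = 1.0 corresponds to the Lasso penalty.
    fit = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9, 1.0], cv=5).fit(X, y)

    # Coefficients shrunk exactly to zero are excluded from the model,
    # so variable selection happens as part of the fit.
    print("selected predictors:", np.flatnonzero(fit.coef_))
    print("chosen shrinkage (alpha):", fit.alpha_)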
The Generalized Regression personality also fits adaptive versions of the Lasso and the Elastic Net. These adaptive versions attempt to penalize active variables less than inactive variables. They were developed to ensure that the oracle property holds. The oracle property guarantees the following: Asymptotically, your estimates are what they would have been had you known which predictors were active contributors to the model. More specifically, your model correctly identifies the predictors that should have zero coefficients, and your estimates converge to those that would have been obtained had you started with only the active predictors. See Adaptive Methods.
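The following is a rough sketch of the general adaptive Lasso idea as a two-stage scheme (an assumption for illustration; it is not the algorithm used by the Generalized Regression personality). An initial fit supplies per-variable weights so that predictors with large preliminary coefficients, which are more likely to be active, receive a smaller penalty than predictors with small ones.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LassoCV, Ridge

    X, y = make_regression(n_samples=50, n_features=100, n_informative=5,
                           noise=1.0, random_state=0)

    # Stage 1: an initial ridge fit gives preliminary coefficient estimates.
    beta_init = Ridge(alpha=1.0).fit(X, y).coef_
    weights = 1.0 / (np.abs(beta_init) + 1e-8)  # small constant avoids division by zero

    # Stage 2: a weighted Lasso, implemented by rescaling each column by
    # 1/weight; dividing the fitted coefficients by the same factor undoes
    # the scaling, so large-weight (likely inactive) variables are penalized more.
    X_scaled = X / weights
    fit = LassoCV(cv=5).fit(X_scaled, y)
    beta_adaptive = fit.coef_ / weights

    print("selected predictors:", np.flatnonzero(beta_adaptive))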