Modeling techniques such as the Elastic Net and the Lasso are particularly promising for large data sets, where collinearity is typically a problem. In fact, modern data sets often include more variables than observations. This situation is sometimes referred to as the p > n problem, where n is the number of observations and p is the number of predictors. Such data sets require variable selection if traditional modeling techniques are to be used.
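The variable-selection behavior described above can be sketched outside JMP. The following is a minimal scikit-learn illustration (not the Generalized Regression personality itself) of fitting an Elastic Net to a synthetic p > n data set; all names, parameter values, and the simulated data are illustrative assumptions.

```python
# Illustrative only: Elastic Net variable selection on a p > n data set
# using scikit-learn; parameter values here are arbitrary choices.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
n, p = 50, 200                      # more predictors than observations (p > n)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]   # only 5 truly active predictors
y = X @ beta + 0.5 * rng.standard_normal(n)

# l1_ratio blends the Lasso penalty (1.0) and the ridge penalty (0.0)
model = ElasticNet(alpha=0.1, l1_ratio=0.9).fit(X, y)

# The L1 component zeroes out most coefficients, performing selection
active = np.flatnonzero(model.coef_)
print(f"{len(active)} of {p} coefficients are nonzero")
```

Even though ordinary least squares is not estimable here (p > n), the penalized fit returns a sparse coefficient vector, which is the sense in which these techniques perform variable selection automatically.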
The Generalized Regression personality also fits adaptive versions of the Lasso and the Elastic Net. These adaptive versions apply smaller penalties to variables that appear to be active than to those that appear to be inactive. They were developed to ensure that the oracle property holds. The oracle property guarantees the following: Asymptotically, your estimates are what they would have been had you known in advance which predictors were active contributors to the model. More specifically: