Modeling techniques such as the Elastic Net and the Lasso are particularly promising for large data sets, where collinearity is typically a problem. In fact, modern data sets often include more variables than observations. This situation is sometimes referred to as the p > n problem, where n is the number of observations and p is the number of predictors. Such data sets require variable selection if traditional modeling techniques are to be used.
• The Elastic Net and Lasso are relatively recent techniques (Tibshirani 1996; Zou and Hastie 2005). Both penalize the size of the model coefficients, resulting in continuous shrinkage. The amount of shrinkage is determined by a tuning parameter, and an optimal level of shrinkage is chosen by one of several validation methods. Both techniques can shrink coefficients exactly to zero, so that variable selection is built into the modeling procedure. The Elastic Net subsumes both the Lasso and ridge regression as special cases. For details, see Statistical Details for Estimation Methods.
• The Lasso has two shortcomings. When several variables are highly correlated, it tends to select only one variable from that group. And when the number of predictors, p, exceeds the number of observations, n, the Lasso selects at most n predictors.

• The Generalized Regression personality also fits adaptive versions of the Lasso and the Elastic Net. These adaptive versions attempt to penalize active variables less than inactive variables, and were developed to ensure that the oracle property holds. The oracle property guarantees the following: Asymptotically, your estimates are what they would have been had you known which predictors were active contributors to the model. More specifically, your model correctly identifies the predictors that should have zero coefficients, and your estimates converge to those that would have been obtained had you started with only the active predictors. See Adaptive Methods.
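Outside JMP, the shrinkage and selection behavior described above can be sketched with scikit-learn's `Lasso` and `ElasticNet`. This is a minimal illustration, not the Generalized Regression personality itself; the simulated data and the `alpha` and `l1_ratio` settings are arbitrary assumptions chosen to show coefficients shrinking to exactly zero in a p > n setting with a correlated group of predictors.

```python
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
n, p = 20, 50  # p > n: more predictors than observations

X = rng.standard_normal((n, p))
# Make the first three columns highly correlated (a "group")
X[:, 1] = X[:, 0] + 0.01 * rng.standard_normal(n)
X[:, 2] = X[:, 0] + 0.01 * rng.standard_normal(n)
y = X[:, 0] + X[:, 1] + X[:, 2] + 0.1 * rng.standard_normal(n)

# alpha is the tuning parameter controlling the amount of shrinkage;
# in practice it would be chosen by a validation method
lasso = Lasso(alpha=0.1).fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

# Both fits drive many coefficients exactly to zero,
# so variable selection is built into the estimation
n_active_lasso = int(np.sum(lasso.coef_ != 0))
n_active_enet = int(np.sum(enet.coef_ != 0))
```

Inspecting `lasso.coef_[:3]` versus `enet.coef_[:3]` illustrates the contrast noted above: the Lasso tends to pick a single representative from the correlated group, while the Elastic Net's ridge component tends to spread weight across the group.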
The Generalized Regression personality enables you to specify a variety of distributions for your response variable. The distributions fit include normal, Cauchy, exponential, gamma, beta, binomial, beta binomial, Poisson, negative binomial, zero-inflated binomial, zero-inflated beta binomial, zero-inflated Poisson, zero-inflated negative binomial, and zero-inflated gamma. This flexibility enables you to fit categorical and count responses, as well as continuous responses, and specifically, right-skewed continuous responses. You can also perform quantile regression. The personality provides a variety of validation criteria for model selection and supports training, validation, and test columns. See Distribution.
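As one concrete case of a non-normal response, a count outcome can be fit with a log-linear Poisson model. The sketch below uses scikit-learn's `PoissonRegressor` rather than JMP; the simulated data, true coefficients, and the small ridge penalty `alpha` are all assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)

# Simulated predictors and a count response from a log-linear model
X = rng.standard_normal((200, 3))
rate = np.exp(0.3 * X[:, 0] - 0.5 * X[:, 1])  # true mean via the log link
y = rng.poisson(rate)

# Lightly penalized Poisson regression; the log link is built in,
# so predictions are always positive
model = PoissonRegressor(alpha=1e-3, max_iter=300).fit(X, y)
pred = model.predict(X)
```

With 200 observations the fitted coefficients should land near the simulated values, and because predictions pass through the exponential inverse link, they are guaranteed to be positive counts' means.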