In JMP Pro, the Fit Model platform’s Generalized Regression personality provides shrinkage techniques that address the challenges of modeling correlated and high-dimensional data. Two of these techniques, the Lasso and the Elastic Net, perform variable selection as part of the model-fitting procedure.
Large data sets that contain many variables typically exhibit multicollinearity. Modern data sets often include more variables than observations, requiring variable selection if traditional modeling techniques are to be used. The presence of multicollinearity and a profusion of predictors expose the shortcomings of classical techniques.
The Lasso and Elastic Net are useful even for small data sets with little or no correlation, including designed experiments. They can be used to obtain better predictive models or to select variables for model reduction or future study.
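To illustrate the idea outside of JMP, the following sketch uses scikit-learn (a hypothetical stand-in for the Generalized Regression personality, not the JMP implementation) to show how the Lasso and Elastic Net shrink some coefficients exactly to zero, performing variable selection on correlated predictors. The data set, penalty values, and true coefficients below are illustrative assumptions.

```python
# Illustrative sketch (not JMP): Lasso and Elastic Net variable selection
# on a synthetic data set where only a few of many predictors truly
# affect the response, and two predictors are strongly correlated.
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)  # introduce collinearity
beta = np.zeros(p)
beta[[0, 3, 7]] = [2.0, -1.5, 1.0]            # only three active predictors
y = X @ beta + rng.normal(size=n)

# alpha controls the shrinkage strength; l1_ratio blends the Lasso (L1)
# and ridge (L2) penalties in the Elastic Net. Values here are arbitrary.
lasso = Lasso(alpha=0.1).fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

# Both penalties zero out many coefficients, selecting a subset of variables.
print("Lasso selects predictors:", np.flatnonzero(lasso.coef_))
print("Elastic Net selects predictors:", np.flatnonzero(enet.coef_))
```

In practice the penalty strength would be chosen by a validation method (for example, cross-validation) rather than fixed in advance, which is also how the Generalized Regression personality tunes its solutions.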
The Generalized Regression personality is useful for many modeling situations. This personality enables you to specify a variety of distributions for your response variable. Use it when your response is continuous, binomial, a count, or a zero-inflated count. Use it when you are interested in variable selection or when you suspect collinearity in your predictors. More generally, use it to fit models that you compare to models obtained using other techniques.