Developer Tutorial: Using JMP Pro Generalized Regression to Analyze Designed Experiments
Design of Experiments
This session is for JMP users who understand basic DOE principles and have used JMP to design experiments.
Observational data is often gathered without input from either the subject of the research or the data analyst. Such data can present analysis problems such as missing key factors, selection bias, multicollinearity, and outliers.
Building predictive models for correlated, high-dimensional data requires variable selection techniques that choose a subset of variables (predictors) to use in modeling a response variable. Shrinkage techniques like the Lasso and the Elastic Net are especially promising for avoiding overfitting with observational data. Variable selection also plays a key role in analyzing designed experiments, but the strategies and techniques used can be slightly different.
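Outside JMP, the shrinkage idea can be sketched in a few lines with scikit-learn. This is an illustrative stand-in, not JMP's implementation: the data, penalty value, and `ElasticNet` settings below are assumptions chosen to show how an L1 penalty zeroes out the coefficients of inactive predictors.

```python
# Minimal sketch of Lasso-style shrinkage for variable selection,
# using scikit-learn as a stand-in for JMP Pro's Generalized Regression.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
# Only the first three predictors truly drive the response.
y = 3 * X[:, 0] - 2 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

# l1_ratio=1.0 is pure Lasso; values in (0, 1) blend in a ridge
# penalty, giving the Elastic Net. alpha controls shrinkage strength.
model = ElasticNet(alpha=0.1, l1_ratio=1.0).fit(X, y)
selected = [j for j, c in enumerate(model.coef_) if abs(c) > 1e-8]
print("nonzero coefficients:", selected)
```

With enough shrinkage, the noise predictors are driven exactly to zero while the active ones survive, which is the behavior that makes these penalties useful for variable selection.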
Generalized Regression in JMP Pro is useful for many modeling situations that include and go beyond variable selection: when you suspect collinearity in your predictors, when you need to specify a variety of distributions for continuous, binomial, count, or zero-inflated responses, and when you want to fit models that you can compare to models obtained using other techniques.
In this session, you will learn how to use JMP Pro’s Generalized Regression to make the most of your designed experiments. We will present and describe the variable selection techniques and model diagnostics in Generalized Regression that will help you find an appropriate model for your data among a sea of possibilities.
This JMP Developer Tutorial covers: Stepwise, Best Subset, Forward and Backward selection; Dantzig selector estimation; Information criteria.
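The information-criterion idea behind comparing candidate models can be sketched as follows. This is a hedged illustration, not JMP's code: the corrected AIC (AICc) formula below uses the standard Gaussian log-likelihood form, and the two candidate models are made-up examples where one predictor is pure noise.

```python
# Sketch of comparing candidate models with AICc, the small-sample
# corrected information criterion used as a model-selection score.
import numpy as np

def aicc(rss, n, k):
    # Gaussian-likelihood AIC with small-sample correction;
    # k counts the regression coefficients plus the error variance.
    aic = n * np.log(rss / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)

rng = np.random.default_rng(1)
n = 50
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 2 * x1 + rng.normal(scale=0.5, size=n)  # x2 is pure noise

for cols in ([x1], [x1, x2]):
    X = np.column_stack([np.ones(n)] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    print(len(cols), "predictor(s): AICc =", round(aicc(rss, n, X.shape[1] + 1), 2))
```

Adding a useless predictor lowers the residual sum of squares only slightly, while the penalty term grows, so the criterion typically favors the smaller model; stepwise and best-subset procedures apply this comparison across many candidate models.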