


Model Comparison Platform Options

Continuous and Categorical Responses

Model Averaging

Makes a new column containing the arithmetic mean of the predicted values (for continuous responses) or the predicted probabilities (for categorical responses) across the models being compared.
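
Conceptually, the averaged column is just the row-by-row arithmetic mean of each model's prediction column. The following Python sketch (not JSL, and not JMP's implementation; the column values are hypothetical) illustrates the calculation for a continuous response:

    import numpy as np

    # Hypothetical predicted-value columns saved by three fitted models
    pred_a = np.array([10.2, 11.5, 9.8, 12.0])
    pred_b = np.array([10.0, 11.9, 9.5, 12.3])
    pred_c = np.array([10.4, 11.2, 10.1, 11.8])

    # Model averaging: arithmetic mean of the predictions, row by row
    pred_avg = np.mean([pred_a, pred_b, pred_c], axis=0)
    print(pred_avg)  # approximately [10.2, 11.53, 9.8, 12.03]

For a categorical response, the same row-by-row mean is taken over each predicted probability column; because each model's probabilities sum to 1 for a row, the averaged probabilities do as well.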

Continuous Responses

Plot Actual by Predicted

Shows a scatterplot of the actual versus the predicted values. The plots for the different models are overlaid.

Plot Residual by Row

Shows a plot of the residuals by row number. The plots for the different models are overlaid.

Profiler

Shows a profiler for each response based on prediction formula columns in your data. The profilers have a row for each model being compared.

Categorical Responses

ROC Curve

Shows ROC curves for each level of the response variable. The curves for the different models are overlaid.
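
The sketch below (Python with scikit-learn, not JSL) shows one common way to build a per-level ROC curve by treating each response level in turn as the positive class. The data are hypothetical and the code is a conceptual illustration, not JMP's implementation:

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.metrics import roc_curve

    # Hypothetical actual labels and one model's predicted probability for each level
    levels = ["Low", "Medium", "High"]
    actual = np.array(["Low", "High", "Medium", "Low", "High", "Medium", "Low", "High"])
    prob = {
        "Low":    np.array([0.7, 0.1, 0.3, 0.6, 0.2, 0.2, 0.8, 0.1]),
        "Medium": np.array([0.2, 0.2, 0.5, 0.3, 0.2, 0.6, 0.1, 0.2]),
        "High":   np.array([0.1, 0.7, 0.2, 0.1, 0.6, 0.2, 0.1, 0.7]),
    }

    # One curve per response level: that level is the "positive" class
    for level in levels:
        fpr, tpr, _ = roc_curve(actual == level, prob[level])
        plt.plot(fpr, tpr, label=level)

    plt.xlabel("1 - Specificity")
    plt.ylabel("Sensitivity")
    plt.legend()
    plt.show()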

AUC Comparison

Provides a comparison of the area under the ROC curve (AUC) for each model. The area under the curve is an indicator of goodness of fit, where 1 indicates a perfect fit.

The report includes the following information:

standard errors and confidence intervals for each AUC

standard errors, confidence intervals, and hypothesis tests for the difference between each pair of AUCs

an overall hypothesis test for testing whether all AUCs are equal
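
The report supplies the standard errors, intervals, and tests directly; the method behind them is not described here. As a rough, hypothetical illustration only (not JMP's procedure), a bootstrap comparison of two AUCs in Python looks like this:

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)

    # Hypothetical binary outcome and predicted probabilities from two models
    y = rng.integers(0, 2, size=200)
    p1 = np.clip(y * 0.6 + rng.normal(0.2, 0.25, size=200), 0, 1)
    p2 = np.clip(y * 0.4 + rng.normal(0.3, 0.25, size=200), 0, 1)

    # Bootstrap the difference in AUC between the two models
    diffs = []
    for _ in range(2000):
        idx = rng.integers(0, len(y), size=len(y))
        if len(np.unique(y[idx])) < 2:     # a resample needs both classes
            continue
        diffs.append(roc_auc_score(y[idx], p1[idx]) - roc_auc_score(y[idx], p2[idx]))

    lo, hi = np.percentile(diffs, [2.5, 97.5])
    print(f"AUC difference: {np.mean(diffs):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")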

Lift Curve

Shows lift curves for each level of the response variable. The curves for the different models are overlaid.
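
Lift at a given portion of the data is the rate of the response level among the highest-scoring rows divided by its overall rate. A minimal Python sketch with hypothetical data (not JMP's implementation):

    import numpy as np

    def lift_curve(y, prob, positive_level):
        """Lift at each sorted portion of the data: rate of the level in the
        top-scoring rows divided by its overall rate (conceptual sketch)."""
        y = np.asarray(y)
        order = np.argsort(prob)[::-1]                 # highest probability first
        hits = np.cumsum(y[order] == positive_level)   # level members found so far
        n = np.arange(1, len(y) + 1)
        overall_rate = np.mean(y == positive_level)
        return n / len(y), (hits / n) / overall_rate

    y = ["Yes", "No", "Yes", "No", "No", "Yes", "No", "No"]
    prob = [0.9, 0.2, 0.7, 0.4, 0.3, 0.8, 0.1, 0.5]
    portion, lift = lift_curve(y, prob, "Yes")
    print(np.round(lift, 2))   # starts near 2.67 (= 1 / (3/8)) and falls to 1.0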

Cum Gains Curve

Shows cumulative gains curves for each level of the response variable. A cumulative gains curve is a plot of the proportion of a response level that is identified by the model against the proportion of all responses. A cumulative gains curve for a perfect model would reach 1.0 at the overall proportion of the response level. The curves for the different models are overlaid.
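
The cumulative gains calculation follows directly from that description: sort the rows by predicted probability, then track what proportion of the response level has been captured as more rows are scored. A conceptual Python sketch with hypothetical data (not JMP's implementation):

    import numpy as np

    def cumulative_gains(y, prob, positive_level):
        """Proportion of the response level captured vs. proportion of rows,
        with rows sorted by predicted probability (conceptual sketch)."""
        y = np.asarray(y)
        order = np.argsort(prob)[::-1]                    # most likely rows first
        captured = np.cumsum(y[order] == positive_level)  # level members found so far
        gains = captured / np.sum(y == positive_level)
        portion = np.arange(1, len(y) + 1) / len(y)
        return portion, gains

    y = ["Yes", "No", "Yes", "No", "No", "Yes", "No", "No"]
    prob = [0.9, 0.2, 0.7, 0.4, 0.3, 0.8, 0.1, 0.5]
    portion, gains = cumulative_gains(y, prob, "Yes")
    # A perfect model reaches 1.0 once portion equals the overall "Yes" rate (3/8 here).
    print(np.round(gains, 2))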

Confusion Matrix

Shows confusion matrices for each model. A confusion matrix is a two-way classification of actual and predicted responses. Count and rate confusion matrices are shown. Separate confusion matrices are produced for each level of the Group variable.

If the response has a Profit Matrix column property, then Actual by Decision Count and Actual by Decision Rate matrices are shown to the right of the confusion matrices. For more information about these matrices, see Additional Examples of Partitioning.
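
The count and rate matrices correspond to the raw two-way classification and its row-wise proportions. A minimal Python sketch with hypothetical data (not JMP's implementation):

    import numpy as np
    from sklearn.metrics import confusion_matrix

    # Hypothetical actual and predicted classes from one model
    actual    = ["Yes", "No", "Yes", "Yes", "No", "No", "Yes", "No"]
    predicted = ["Yes", "No", "No",  "Yes", "No", "Yes", "Yes", "No"]

    counts = confusion_matrix(actual, predicted, labels=["Yes", "No"])
    rates = counts / counts.sum(axis=1, keepdims=True)   # each actual (row) sums to 1

    print(counts)                # rows = actual, columns = predicted
    print(np.round(rates, 2))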

Profiler

Shows a profiler for each response based on prediction formula columns in your data. The profilers have a row for each model being compared.

Decision Threshold

(Available only for binary categorical responses.) Shows or hides Decision Thresholds reports for the training, validation, and test sets, if specified. Each report contains a graph of the distribution of fitted probabilities for each model, confusion matrices for each model, and classification graphs to compare the model fits. See Decision Thresholds Report.
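
To see what the report is varying, the sketch below re-classifies one model's fitted probabilities at several cutoffs and prints the resulting confusion matrices (Python with hypothetical data; JMP presents this graphically in the report):

    import numpy as np
    from sklearn.metrics import confusion_matrix

    # Hypothetical binary outcome (1 = "Yes") and one model's fitted probabilities
    y    = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
    prob = np.array([0.9, 0.3, 0.6, 0.8, 0.4, 0.2, 0.55, 0.35, 0.45, 0.65])

    # Re-classify at several decision thresholds to see how the matrix shifts
    for cutoff in (0.3, 0.5, 0.7):
        pred = (prob >= cutoff).astype(int)
        print(f"cutoff = {cutoff}")
        print(confusion_matrix(y, pred, labels=[1, 0]))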

Related Information

ROC Curve

Lift Curve
