The negative sum of the natural logs of the observed probabilities is called the negative log-likelihood (–LogLikelihood). The negative log-likelihood for categorical data plays the same role as sums of squares in continuous data: twice the difference between the negative log-likelihoods of the model fitted to the data and the model with equal probabilities is a Chi-square statistic. This test statistic examines the hypothesis that the x variable has no effect on the responses.
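This relationship can be illustrated with a minimal sketch (this is not JMP's implementation, and the probability values are hypothetical): each row contributes the negative log of the model's predicted probability for the response actually observed, and twice the drop in –LogLikelihood from the equal-probability model to the fitted model is the Chi-square statistic.

```python
import math

# Hypothetical predicted probabilities of the observed outcome for each row,
# from a fitted model and from a two-level equal-probability (null) model.
p_fitted = [0.9, 0.7, 0.8, 0.6, 0.75]
p_equal = [0.5] * len(p_fitted)

# -LogLikelihood is the negative sum of natural logs of these probabilities.
neg_log_lik_full = -sum(math.log(p) for p in p_fitted)
neg_log_lik_reduced = -sum(math.log(p) for p in p_equal)

# Twice the difference is the likelihood ratio Chi-square statistic.
chi_square = 2 * (neg_log_lik_reduced - neg_log_lik_full)
```

A better-fitting model assigns higher probabilities to what actually occurred, so its –LogLikelihood is smaller and the Chi-square statistic is larger.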
Values of RSquare (U) (sometimes denoted R2) range from 0 to 1. High R2 values indicate a good model fit but are rare in categorical models.
 – The Reduced model contains only an intercept.
 – The Full model contains all of the effects as well as the intercept.
 – The Difference is the difference between the negative log-likelihoods of the full and reduced models.
Measures variation, sometimes called uncertainty, in the sample.
Full (the full model) is the negative log-likelihood (or uncertainty) calculated after fitting the model. The fitting process involves predicting response rates with a linear model and a logistic response function. This value is minimized by the fitting process.
Reduced (the reduced model) is the negative log-likelihood (or uncertainty) for the case when the probabilities are estimated by fixed background rates. This is the background uncertainty when the model has no effects.
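The Reduced value can be sketched as follows (a hedged illustration, not JMP's code, with a hypothetical two-level sample): the fixed background rates are the marginal response frequencies, and each observation contributes the negative log of its level's rate.

```python
import math
from collections import Counter

# Hypothetical two-level categorical responses.
responses = ["yes", "yes", "no", "yes", "no", "yes"]
n = len(responses)

# Fixed background rates: the marginal frequency of each response level.
rates = {level: count / n for level, count in Counter(responses).items()}

# Reduced -LogLikelihood: background uncertainty when the model has no effects.
neg_log_lik_reduced = -sum(math.log(rates[r]) for r in responses)
```

Because the reduced model ignores any x variable, this value is an upper bound on the Full –LogLikelihood: adding effects can only lower the uncertainty.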
For more information, see Likelihood, AICc, and BIC in the Fitting Linear Models book.
The likelihood ratio Chi-square test of the hypothesis that the model fits no better than fixed response rates across the whole sample. It is twice the –LogLikelihood for the Difference model: two times the difference between two negative log-likelihoods, one computed with whole-population response probabilities and one with each-population response rates. For more information, see Whole Model Test Report.
The observed significance probability, often called the p value, for the Chi-square test. It is the probability of getting, by chance alone, a Chi-square value greater than the one computed. Models are often judged significant if this probability is below 0.05.
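A sketch of this significance test for the one-degree-of-freedom case (for example, a single two-level x variable); the Chi-square value here is hypothetical, and the upper-tail probability for 1 df is computed with the complementary error function rather than a statistics library:

```python
import math

chi_square = 3.96   # hypothetical: twice the -LogLikelihood for the Difference

# Upper-tail probability of a Chi-square variate with 1 degree of freedom:
# P(X > x) = erfc(sqrt(x / 2)).
p_value = math.erfc(math.sqrt(chi_square / 2.0))

# The model is often judged significant if this probability is below 0.05.
significant = p_value < 0.05
```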
The proportion of the total uncertainty that is attributed to the model fit, defined as the Difference negative log-likelihood value divided by the Reduced negative log-likelihood value. An RSquare (U) value of 1 indicates that the predicted probabilities for events that occur are equal to one: There is no uncertainty in predicted probabilities. Because certainty in the predicted probabilities is rare for logistic models, RSquare (U) tends to be small. For more information, see Whole Model Test Report.
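The definition above reduces to a single ratio; a minimal sketch with hypothetical negative log-likelihood values:

```python
# Hypothetical -LogLikelihood values from a whole-model test report.
neg_log_lik_reduced = 3.82   # background uncertainty, no effects
neg_log_lik_full = 1.48      # uncertainty after fitting the model

# RSquare (U): proportion of total uncertainty attributed to the model fit.
difference = neg_log_lik_reduced - neg_log_lik_full
rsquare_u = difference / neg_log_lik_reduced
```

A value of 0 means the model removed none of the background uncertainty; a value of 1 would mean every predicted probability for an observed event is exactly one.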
Note: RSquare (U) is also known as McFadden’s pseudo R-square.

Help created on 9/19/2017