Publication date: 05/05/2023

Statistical Details for the Agreement Statistic

This section contains details for the agreement statistic in the Contingency platform. Viewing the two response variables as two independent ratings of the n subjects, the Kappa coefficient equals +1 when there is complete agreement of the raters. When the observed agreement exceeds the amount of agreement expected just by chance, the Kappa coefficient is positive and its magnitude reflects the strength of agreement. Although unusual in practice, Kappa is negative when the observed agreement is less than the amount of agreement expected just by chance. The minimum value of Kappa depends on the marginal proportions, but it is always between -1 and 0.

The Kappa coefficient is computed as follows:

$\hat{\kappa} = \dfrac{P_0 - P_e}{1 - P_e}$ where $P_0 = \sum_{i} p_{ii}$ and $P_e = \sum_{i} p_{i \cdot}\, p_{\cdot i}$

Note that $p_{ij}$ is the proportion of subjects in the $(i,j)$th cell, such that $\sum_{i,j} p_{ij} = 1$.
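As a concrete illustration, these quantities can be computed directly from a square table of raw counts. The following pure-Python sketch mirrors the definitions above; the function name and example table are hypothetical, not part of JMP.

```python
# Sketch (not JMP's implementation): Cohen's kappa from an r-by-r
# contingency table of raw counts.
def cohen_kappa(counts):
    r = len(counts)
    n = sum(sum(row) for row in counts)           # total subjects n
    p = [[c / n for c in row] for row in counts]  # cell proportions p_ij
    p0 = sum(p[i][i] for i in range(r))           # observed agreement P0
    row = [sum(rw) for rw in p]                   # row marginals p_i.
    col = [sum(p[i][j] for i in range(r)) for j in range(r)]  # column marginals p_.j
    pe = sum(row[i] * col[i] for i in range(r))   # chance agreement Pe
    return (p0 - pe) / (1 - pe)

# Complete agreement (all counts on the diagonal) gives kappa = +1.
print(cohen_kappa([[20, 0], [0, 30]]))  # 1.0
```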

The asymptotic variance of the simple kappa coefficient is estimated by the following:

$\widehat{\operatorname{var}}(\hat{\kappa}) = \dfrac{A + B - C}{(1 - P_e)^2\, n}$ where $A = \sum_{i} p_{ii}\left[1 - (p_{i \cdot} + p_{\cdot i})(1 - \hat{\kappa})\right]^2$, $B = (1 - \hat{\kappa})^2 \sum_{i \neq j} p_{ij}\left(p_{\cdot i} + p_{j \cdot}\right)^2$ and

$C = \left[\hat{\kappa} - P_e(1 - \hat{\kappa})\right]^2$

See Cohen (1960) and Fleiss et al. (1969).
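The variance estimate can likewise be sketched from the same cell proportions. Again, the function and example table below are illustrative assumptions, not JMP's implementation.

```python
# Sketch (not JMP's implementation): asymptotic variance of the simple
# kappa coefficient, following the A, B, C decomposition above.
def kappa_variance(counts):
    r = len(counts)
    n = sum(sum(row) for row in counts)
    p = [[c / n for c in row] for row in counts]      # p_ij
    row = [sum(rw) for rw in p]                       # p_i.
    col = [sum(p[i][j] for i in range(r)) for j in range(r)]  # p_.j
    p0 = sum(p[i][i] for i in range(r))               # P0
    pe = sum(row[i] * col[i] for i in range(r))       # Pe
    k = (p0 - pe) / (1 - pe)                          # kappa-hat
    A = sum(p[i][i] * (1 - (row[i] + col[i]) * (1 - k)) ** 2 for i in range(r))
    B = (1 - k) ** 2 * sum(p[i][j] * (col[i] + row[j]) ** 2
                           for i in range(r) for j in range(r) if i != j)
    C = (k - pe * (1 - k)) ** 2
    return (A + B - C) / ((1 - pe) ** 2 * n)

print(kappa_variance([[20, 5], [10, 15]]))  # ≈ 0.016128
```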

For Bowker’s test of symmetry, the null hypothesis is that the probabilities in the square table satisfy symmetry ($p_{ij} = p_{ji}$ for all pairs $i, j$).
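Under this null, a common form of Bowker's statistic sums $(n_{ij} - n_{ji})^2 / (n_{ij} + n_{ji})$ over the off-diagonal pairs and is compared to a chi-square distribution with $r(r-1)/2$ degrees of freedom. The sketch below is a generic illustration of that form, not JMP's implementation; the function name and table are hypothetical.

```python
# Sketch of Bowker's symmetry statistic for an r-by-r table of counts.
# Pairs with n_ij + n_ji = 0 are skipped (a common convention).
def bowker_statistic(counts):
    r = len(counts)
    stat, df = 0.0, 0
    for i in range(r):
        for j in range(i + 1, r):
            nij, nji = counts[i][j], counts[j][i]
            if nij + nji > 0:
                stat += (nij - nji) ** 2 / (nij + nji)
                df += 1
    return stat, df  # compare stat to chi-square with df degrees of freedom

# A perfectly symmetric table yields a statistic of 0.
print(bowker_statistic([[10, 3, 1], [3, 8, 2], [1, 2, 12]]))  # (0.0, 3)
```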
