Publication date: 08/13/2020

Agreement Reports

Note: The Kappa value is a statistic that expresses agreement. The closer the Kappa value is to 1, the more agreement there is. A Kappa value closer to 0 indicates less agreement.
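To illustrate the idea (this is not JMP's internal implementation), the Kappa statistic for two raters can be computed as Cohen's kappa: the observed agreement corrected for the agreement expected by chance. A minimal sketch, using a hypothetical pass/fail inspection data set:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(ratings_a)
    # Observed proportion of items on which the two raters agree.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal category proportions.
    counts_a = Counter(ratings_a)
    counts_b = Counter(ratings_b)
    p_e = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters classifying 10 parts as pass/fail.
a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
b = ["pass", "pass", "fail", "fail", "fail", "pass", "pass", "fail", "pass", "fail"]
print(round(cohens_kappa(a, b), 3))  # → 0.6
```

Here the raters agree on 8 of 10 parts (0.8 observed agreement), but chance alone would produce 0.5 agreement given their marginal pass/fail rates, so Kappa is (0.8 − 0.5)/(1 − 0.5) = 0.6: substantial but not perfect agreement.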

The Agreement Report shows agreement summarized for each rater and overall agreement. This report is a numeric form of the data presented in the second chart in the Attribute Gauge Chart report (Figure 6.5).

The Agreement Comparisons report shows each rater compared with all other raters, using Kappa statistics. The rater is compared with the standard only if you have specified a Standard variable in the launch window.

The Agreement within Raters report shows, for each rater, the number of items that were inspected. The confidence intervals are score confidence intervals, as suggested by Agresti and Coull (1998). The Number Matched is the number of inspected items for which the rater agreed with himself or herself on every inspection of that item. The Rater Score is the Number Matched divided by the Number Inspected.
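The score confidence interval that Agresti and Coull (1998) recommend is the Wilson score interval for a binomial proportion. A minimal sketch of how such an interval could be computed for a Rater Score (the example counts are hypothetical, and this is not necessarily JMP's exact computation):

```python
import math

def wilson_score_interval(matched, inspected, z=1.96):
    """Wilson score confidence interval for a binomial proportion,
    the interval recommended by Agresti and Coull (1998).
    Applied here to Rater Score = Number Matched / Number Inspected."""
    p_hat = matched / inspected          # the Rater Score
    denom = 1 + z**2 / inspected
    center = (p_hat + z**2 / (2 * inspected)) / denom
    half = (z / denom) * math.sqrt(
        p_hat * (1 - p_hat) / inspected + z**2 / (4 * inspected**2)
    )
    return center - half, center + half

# Hypothetical rater: matched themselves on 45 of 50 re-inspected items.
lo, hi = wilson_score_interval(45, 50)
print(f"Rater Score = {45/50:.2f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Note that the interval is not symmetric about the Rater Score of 0.90; score intervals pull toward 0.5, which gives them better coverage than the naive Wald interval when the proportion is near 0 or 1.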

The Agreement across Categories report shows the agreement in classification over that which would be expected by chance. It assesses the agreement between a fixed number of raters when classifying items.
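The documentation does not spell out the formula here, but a standard statistic for chance-corrected agreement among a fixed number of raters is Fleiss' kappa. A minimal sketch with hypothetical counts (not necessarily JMP's exact computation):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a fixed number of raters classifying items.
    counts[i][j] = number of raters assigning item i to category j."""
    n_items = len(counts)
    n_raters = sum(counts[0])
    # Mean per-item agreement: proportion of rater pairs that agree.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_items
    # Chance agreement from the overall category proportions.
    n_cats = len(counts[0])
    totals = [sum(row[j] for row in counts) for j in range(n_cats)]
    p_e = sum((t / (n_items * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 3 raters, 4 items, two categories (pass/fail).
counts = [[3, 0], [2, 1], [3, 0], [0, 3]]
print(round(fleiss_kappa(counts), 3))  # → 0.625
```

As in the single-rater case, a value near 1 indicates agreement well above what chance classification would produce, and a value near 0 indicates agreement no better than chance.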

Figure 6.6 Agreement Reports
