Kappa


The Kappa Coefficient, also referred to as the Kappa Statistic, is a measure of the degree of agreement among raters judging the same phenomenon, corrected for the agreement that would be expected by chance alone. In other words, the Kappa Coefficient is an indication of Inter-Rater Reliability, which denotes the extent to which ratings are consistent across raters. Kappa tends to increase when raters are well-trained and know they are being observed.
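The source does not name a specific variant; the most common two-rater form is Cohen's Kappa, which compares the observed proportion of agreement, p_o, with the proportion of agreement expected by chance, p_e:

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

Here p_o is the proportion of items on which the two raters assign the same category, and p_e is the agreement expected if their ratings were statistically independent.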

The Kappa Coefficient is applicable only when the data are categorical (nominal, ordinal, or otherwise discrete). The Kappa Coefficient can range from -1 (complete disagreement) to +1 (perfect agreement), with 0 indicating agreement no better than chance. A value in the upper 0.80 - 0.90 range is generally viewed as an adequate degree of consistency across raters.
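As a concrete illustration, the following is a minimal sketch of computing Cohen's Kappa for two raters assigning nominal labels to the same items; the function name and example data are illustrative, not from the source.

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's Kappa for two raters assigning nominal labels to the same items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)

    # Observed agreement: proportion of items where both raters chose the same label.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement: sum over labels of the product of each rater's marginal proportions.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum((freq_a[label] / n) * (freq_b[label] / n) for label in freq_a)

    # Kappa rescales observed agreement by the agreement achievable beyond chance.
    return (p_o - p_e) / (1 - p_e)

# Two raters classifying ten items as "yes" or "no" (illustrative data).
rater_1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
rater_2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohen_kappa(rater_1, rater_2), 3))  # 0.583: raw agreement is 0.8, chance agreement 0.52
```

Note that even though the raters agree on 8 of 10 items, Kappa is only about 0.58, because much of that agreement could have occurred by chance given the raters' label frequencies.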
