Kappa statistic

Interpretation
Landis and Koch[1] proposed the schema in the table below for interpreting values of the kappa statistic.
Kappa value | Interpretation
---|---
< 0 | Poor agreement
0.00–0.20 | Slight agreement
0.21–0.40 | Fair agreement
0.41–0.60 | Moderate agreement
0.61–0.80 | Substantial agreement
0.81–1.00 | Almost perfect agreement
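
As an illustration, here is a minimal Python sketch that maps a kappa value onto the categories in the table above; the function name `landis_koch_label` is hypothetical and chosen only for this example.

```python
def landis_koch_label(kappa: float) -> str:
    """Return the Landis and Koch (1977) agreement category for a kappa value."""
    # Categories follow the table above; upper bounds are inclusive.
    if kappa < 0:
        return "Poor agreement"
    if kappa <= 0.20:
        return "Slight agreement"
    if kappa <= 0.40:
        return "Fair agreement"
    if kappa <= 0.60:
        return "Moderate agreement"
    if kappa <= 0.80:
        return "Substantial agreement"
    return "Almost perfect agreement"


print(landis_koch_label(0.57))  # -> "Moderate agreement"
```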
References
- Landis JR, Koch GG (1977). "The measurement of observer agreement for categorical data". Biometrics 33(1): 159–74. PMID 843571.