Cohen's Kappa



Formula: κ = (Po - Pe) / (1 - Pe)

Cohen's Kappa (κ) is a statistical measure of inter-rater reliability for categorical items. It is generally considered more robust than a simple percent-agreement calculation, since κ accounts for the agreement that would occur by chance. The formula involves two main components: Po, the observed agreement among raters, and Pe, the hypothetical probability of chance agreement. Using these, the Kappa statistic is calculated with the formula above, yielding a value between -1 and 1. A value of 1 indicates perfect agreement, a value of 0 indicates no agreement beyond what would be expected by chance, and negative values indicate systematic disagreement. This measure is useful in fields such as psychology, where researchers need to ensure that observations or ratings are consistent across different observers.
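As a minimal sketch, assuming the two raters' labels are supplied as equal-length Python lists, κ can be computed directly from the formula above. The function name cohens_kappa and the sample data are illustrative, not part of the calculator itself.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Compute Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b) and len(rater_a) > 0
    n = len(rater_a)

    # Po: observed agreement, the fraction of items both raters labelled identically.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Pe: chance agreement, the probability that both raters independently pick
    # the same category, summed over all categories.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    pe = sum((counts_a[c] / n) * (counts_b[c] / n)
             for c in set(counts_a) | set(counts_b))

    # Kappa: agreement beyond chance, normalised by the maximum possible
    # agreement beyond chance (undefined when Pe equals 1).
    return (po - pe) / (1 - pe)

# Example: two raters classifying 10 items as "yes"/"no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # Po = 0.8, Pe = 0.52, so κ ≈ 0.583
```

In this example the raters agree on 8 of 10 items (Po = 0.8), while chance alone would predict agreement on 52% of items (Pe = 0.52), giving κ ≈ 0.583, i.e. moderate agreement beyond chance.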

Tags: Statistics, Reliability, Cohen's Kappa, Inter-Rater Agreement