Jean-Philippe Goldman Activities and Publications

UniGe, Department of Linguistics

Inter-rater agreement

This plugin computes Cohen's Kappa and Fleiss' Kappa coefficients to measure inter-rater agreement on the basis of similar tiers (e.g. annotated syllable tiers). Simply provide the name of the raters' tiers and the possible answers. First, an agreement table is created; then the kappa score is computed.

Cohen's Kappa works for 2 raters and 2 categories of answers (in this implementation). The 2 columns of the intermediate table are the raters, with 0s and 1s as answers. You can also compute the score directly from the four cell counts of the 2x2 agreement table.
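
As an illustration of that last point, here is a minimal Python sketch (not the plugin's own script) computing Cohen's Kappa from the four cell counts of a 2x2 agreement table; the function and parameter names are hypothetical:

    def cohen_kappa_2x2(a, b, c, d):
        """Cohen's kappa from the four cells of a 2x2 agreement table.
        a: both raters answered 1, b: rater 1 = 1 / rater 2 = 0,
        c: rater 1 = 0 / rater 2 = 1, d: both raters answered 0."""
        n = a + b + c + d
        p_observed = (a + d) / n
        # expected agreement from each rater's marginal proportions
        p1_yes = (a + b) / n
        p2_yes = (a + c) / n
        p_expected = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)
        return (p_observed - p_expected) / (1 - p_expected)

For example, cohen_kappa_2x2(20, 5, 10, 15) returns about 0.40.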

Fleiss' Kappa works for any number of raters and any number of categories. The columns of the table represent the possible answers, and each cell holds the number of raters who chose that answer for a given item. See the demo tables in Kappa-Fleiss from Table.
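
The following Python sketch shows how Fleiss' Kappa is computed from such a table (rows = items, columns = possible answers, cells = number of raters choosing that answer). It is an assumption-laden illustration, not the plugin's actual script, and the function name is hypothetical:

    def fleiss_kappa(table):
        """Fleiss' kappa from a list of rows; each row is one item and each
        column one category, cells hold rater counts.
        Every row must sum to the same number of raters."""
        n_items = len(table)
        n_raters = sum(table[0])
        n_categories = len(table[0])
        # proportion of all assignments falling into each category
        totals = [sum(row[j] for row in table) for j in range(n_categories)]
        p_j = [t / (n_items * n_raters) for t in totals]
        # per-item agreement, averaged over items
        p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
               for row in table]
        p_bar = sum(p_i) / n_items
        p_expected = sum(p * p for p in p_j)
        return (p_bar - p_expected) / (1 - p_expected)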

Kappa          Agreement
< 0            Poor
0.01 - 0.20    Slight
0.21 - 0.40    Fair
0.41 - 0.60    Moderate
0.61 - 0.80    Substantial
0.81 - 1.00    Almost perfect

Download

plugin_agreement.zip (7 KB)

Installation

Unzip the archive, copy it into your Praat folder (e.g. C:\Users\jpg\Praat), and relaunch Praat. A scripted alternative is sketched below.
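
As a convenience, here is a small Python sketch that unpacks the archive into the Praat preferences folder. The paths are assumptions based on the Windows example above; adjust them to your own system:

    import zipfile
    from pathlib import Path

    # Hypothetical paths: the downloaded archive and the Praat folder
    # (e.g. C:\Users\jpg\Praat on Windows; may differ on other systems).
    archive = Path("plugin_agreement.zip")
    praat_dir = Path.home() / "Praat"

    with zipfile.ZipFile(archive) as zf:
        zf.extractall(praat_dir)  # creates the plugin_agreement folder
    # Relaunch Praat so the plugin's menu commands are picked up.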