Hi Martin,
> Also, is there any way to consider agreement between overlapping annotations? This is an important issue because we have annotated many phrases, resulting in different annotations for different annotators that overlap.
For features on span layers, there should be the option to select the "Krippendorff's Alpha (unitizing / character offsets)" agreement measure, which takes partial overlaps between annotations into account.
> As I understand it, most of the different agreement scores used do not include incomplete annotations (when a position has only been annotated by one annotator and not the second). Krippendorff's Alpha may or may not include these cases. So why do all my scores have the same value? The Krippendorff (with or without the incomplete annotations), Cohen and Fleiss all have exactly the same values.
For a real-world annotation project, I would not expect all the agreement measures to yield identical results. You might have hit a bug. But before looking into that: which version of WebAnno are you using?
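As a quick illustration of why identical scores are suspicious (this is a standalone sketch in pure Python with made-up labels, independent of how WebAnno computes its scores): Cohen's kappa and Krippendorff's alpha (nominal, two annotators, no missing values) use different expected-disagreement models, so on the same data they normally come out slightly different:

```python
from collections import Counter

def cohen_kappa(a, b):
    """Cohen's kappa for two annotators over the same items (nominal labels)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    # chance agreement from each annotator's own label distribution
    pe = sum(ca[c] / n * cb[c] / n for c in set(a) | set(b))
    return (po - pe) / (1 - pe)

def krippendorff_alpha_nominal(a, b):
    """Krippendorff's alpha (nominal) for two annotators, no missing values,
    via the coincidence-matrix formulation."""
    o = Counter()
    for x, y in zip(a, b):       # each unit contributes both ordered pairs
        o[(x, y)] += 1
        o[(y, x)] += 1
    n = sum(o.values())          # = 2 * number of units
    nc = Counter()               # marginal value frequencies (row sums)
    for (x, _), v in o.items():
        nc[x] += v
    do = sum(v for (x, y), v in o.items() if x != y) / n
    de = sum(nc[x] * nc[y] for x in nc for y in nc if x != y) / (n * (n - 1))
    return 1 - do / de

# toy span labels from two hypothetical annotators
ann1 = ["PER", "PER", "ORG", "LOC", "PER", "ORG", "LOC", "LOC"]
ann2 = ["PER", "ORG", "ORG", "LOC", "PER", "PER", "LOC", "ORG"]
print(cohen_kappa(ann1, ann2))                # ~0.44
print(krippendorff_alpha_nominal(ann1, ann2)) # ~0.47
```

If your project data really produces bit-identical values across Krippendorff, Cohen and Fleiss, that points to the same underlying computation being run each time.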
Cheers,
-- Richard