We calculate the inter-annotator agreement using Fleiss’ kappa [2], a statistical measure of agreement among three or more annotators. For the annotation of the primary symptoms, we measure a kappa value of \(\kappa = 0.61\), which indicates substantial agreement among the three annotators.
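As a minimal sketch of this computation (the label matrix below is hypothetical placeholder data, not our study's annotations), Fleiss’ kappa can be computed from the per-item labels of the three annotators with statsmodels:

```python
# Minimal sketch: Fleiss' kappa for three annotators using statsmodels.
# The labels below are hypothetical placeholders, not the study data.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# One row per annotated item, one column per annotator (3 annotators).
# 0/1/2 encode three hypothetical primary-symptom categories.
labels = np.array([
    [0, 0, 0],
    [0, 0, 1],
    [2, 2, 2],
    [1, 1, 2],
    [0, 1, 0],
])

# Convert the rater labels into an items x categories count table,
# then compute Fleiss' kappa on those counts.
table, _ = aggregate_raters(labels)
kappa = fleiss_kappa(table, method='fleiss')
print(f"Fleiss' kappa: {kappa:.2f}")
```

Kappa values between 0.61 and 0.80 are conventionally interpreted as substantial agreement.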