Hello,
I am calculating Kappas to assess inter-rater agreement on specific features in radiographs; there are 6 raters, 5 possible outcomes, and some missing data (i.e., not all raters rated all radiographs). Using
kap rater1 rater2 rater3 rater4 rater5 rater6
produces a nice table with Kappas for each rating category as well as a combined Kappa. I then use the 'kapci' command to produce a confidence interval for the overall/combined Kappa. So far so good!
However, I'd also like to calculate confidence intervals for the Kappas for each rating category - does anyone know how to do that, or whether it is even possible? I've drawn a blank searching the documentation and this forum. I am aware of the kappaetc and kappa2 commands, but neither seems to have an option to calculate the category-specific CIs.
I am calculating both unweighted and weighted Kappas, in case that matters.
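For what it's worth, one workaround I have been considering (an untested sketch, not something from the documentation) rests on the idea that the category-specific kappa for category k should equal the kappa obtained by dichotomizing the ratings into "k vs. not k". If that holds, a CI per category could be bootstrapped via kapci on indicator variables. This assumes the outcomes are coded 1 through 5; the variable names below are just illustrative:

```stata
* Untested sketch: dichotomize each rating category and get a bootstrap
* CI for that category's kappa via -kapci- (assumes outcomes coded 1-5).
forvalues k = 1/5 {
    foreach v of varlist rater1-rater6 {
        * indicator: 1 if this rater chose category k, missing if unrated
        generate byte `v'_cat`k' = (`v' == `k') if !missing(`v')
    }
    display as text "Category `k':"
    kapci rater1_cat`k' rater2_cat`k' rater3_cat`k' ///
          rater4_cat`k' rater5_cat`k' rater6_cat`k'
    drop rater1_cat`k'-rater6_cat`k'
}
```

I'd be grateful if someone could confirm whether the dichotomized kappas really do match the category-specific values that kap reports, or point me to a cleaner way.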
Many thanks,
Kristien