MATLAB Answers

How to test for a significant difference between Cohen's Kappa values?

Leonard Hickman on 6 Sep 2021
Answered: Jeff Miller on 14 Sep 2021
I have calculated the Cohen's Kappa value determining agreement between Test A and Test B, as well as Cohen's Kappa for agreement between Test A and Test C. What method would I use to test for a significant difference in Kappa values between the A-B agreement and the A-C agreement? Are there any existing scripts/functions available for this?
  2 Comments
Leonard Hickman on 13 Sep 2021
Two separate samples: one sample that underwent tests A & B and one that underwent tests A & C.


Answers (3)

Star Strider on 6 Sep 2021
Edited: Star Strider on 13 Sep 2021
I used Cohen’s κ many years ago. From my understanding, from reading Fleiss’s book (and corresponding with him), Cohen’s κ is approximately normally distributed. An excellent (in my opinion) and free resource is Interrater reliability: the kappa statistic. There are others, although not all are free.
EDIT — (13 Sep 2021 at 10:58)
To get p-values and related statistics for normally-distributed variables, the ztest function would likely be appropriate.
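A minimal sketch of that comparison, assuming each kappa routine also returns the standard error of its estimate (the numbers below are hypothetical; ztest itself operates on a raw data vector, so with only summary statistics the z value is formed by hand and converted with normcdf):
kAB = 0.62;  seAB = 0.08;   % hypothetical kappa and SE for the A-B sample
kAC = 0.45;  seAC = 0.11;   % hypothetical kappa and SE for the A-C sample
z = (kAB - kAC) / sqrt(seAB^2 + seAC^2);   % z statistic for two independent, approximately normal estimates
p = 2*normcdf(abs(z), 'upper')             % two-sided p-value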

Ive J on 12 Sep 2021
You can build confidence intervals around your Kappa values and then see whether they overlap.
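A minimal sketch of that check, assuming each kappa routine also returns a standard error (the values are hypothetical):
kAB = 0.62;  seAB = 0.08;   % hypothetical kappa and SE for the A-B sample
kAC = 0.45;  seAC = 0.11;   % hypothetical kappa and SE for the A-C sample
ciAB = kAB + [-1 1]*1.96*seAB   % 95% CI for kappa(A,B)
ciAC = kAC + [-1 1]*1.96*seAC   % 95% CI for kappa(A,C)
% Non-overlapping intervals indicate a significant difference; note that this
% check is conservative compared with a direct test on the difference.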
  2 Comments
Ive J on 13 Sep 2021
You may want to take a look at this thread. From that you can calculate the z-score and then convert it to a p-value:
pval = 2*normcdf(abs(zvalue), 'upper'); % two-sided test; abs() handles either sign of z



Jeff Miller on 14 Sep 2021
As I understand it, the fundamental question is whether tests A & B agree better than tests A & C, beyond a minor improvement that could just be due to chance (or agree worse, depending on how tests B and C are labelled). The null hypothesis is that the agreement between A & B is equal to the agreement between A & C.
The most straightforward test for this case is the chi-square test for independence. Imagine the data summarized in a 2x2 table like this:
%                 Tests agree    Tests disagree
% A & B group:        57                17
% A & C group:        35                 8
with total N's of 74 in the first group and 43 in the second group. MATLAB's 'crosstab' command will compute that chi-square test for you. See this answer for an explanation of how to format the data and run the test.
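As a rough sketch of how that could look (the counts are the illustrative ones from the table above; crosstab expects one observation per row, so the summary counts are expanded first):
% Expand the 2x2 summary counts into one observation per case
group = [repmat({'A & B'}, 74, 1); repmat({'A & C'}, 43, 1)];
agree = [repmat({'agree'}, 57, 1); repmat({'disagree'}, 17, 1); ...
         repmat({'agree'}, 35, 1); repmat({'disagree'},  8, 1)];
% Chi-square test of independence on the resulting 2x2 table
[tbl, chi2, p] = crosstab(group, agree)   % p is the chi-square p-value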
Cohen's Kappa is a useful numerical measure of the extent of agreement, but it isn't really optimal for deciding whether the levels of agreement are different for the two pairs of tests.

Release

R2021a
