Measurement of interrater agreement with adjustment for covariates
Authors: Barlow W
Institution: Center for Health Studies, Group Health Cooperative, 1730 Minor Avenue, Suite 1600, Seattle, Washington 98101-1448, USA.
Abstract: The kappa coefficient measures chance-corrected agreement between two observers in the dichotomous classification of subjects. The marginal probability of classification by each rater may, however, depend on one or more confounding variables, and failure to account for these confounders may lead to inflated estimates of agreement. A multinomial model is used that assumes both raters have the same marginal probability of classification, where this probability may depend on one or more covariates. The model may be fit using software for conditional logistic regression, and likelihood-based confidence intervals for the parameter representing agreement may be computed. A simple example illustrates model fitting and application of the technique.
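To make the starting point concrete, the sketch below computes the unadjusted kappa coefficient that the abstract describes and the paper generalizes: chance-corrected agreement between two raters on a dichotomous scale. The function and the ratings are hypothetical illustrations, not the paper's data.

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two raters on a dichotomous scale.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from the raters' marginals.
    """
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_o = np.mean(r1 == r2)              # observed proportion of agreement
    p1, p2 = r1.mean(), r2.mean()        # each rater's marginal P(positive)
    p_e = p1 * p2 + (1 - p1) * (1 - p2)  # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: 1 = positive, 0 = negative classification
rater1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
rater2 = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
print(f"kappa = {cohen_kappa(rater1, rater2):.3f}")  # prints kappa = 0.600
```

The covariate adjustment itself is not shown here: per the abstract, it requires fitting the multinomial model with conditional logistic regression software (for example, clogit in R's survival package, or ConditionalLogit in Python's statsmodels), with the data arranged as the paper prescribes.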
This article is indexed by PubMed and other databases.