𝔖 Bobbio Scriptorium
✦   LIBER   ✦

P113 Evaluating interrater agreement based on categorical scales

โœ Scribed by Paul Kotey; Audrey Yu; Bruce Binkowitz


Book ID: 113298383
Publisher: Elsevier Science
Year: 1995
Tongue: English
Weight: 74 KB
Volume: 16
Category: Article
ISSN: 0197-2456

No coin nor oath required. For personal study only.


📜 SIMILAR VOLUMES


The measurement of interobserver agreement
โœ M.A.A. Moussa ๐Ÿ“‚ Article ๐Ÿ“… 1985 ๐Ÿ› Elsevier Science โš– 340 KB

The kappa statistic is used to measure interobserver agreement based on categorical scales. The cases of two or more observers with two or more rating categories are considered. Allowance is made for the attachment of disagreement weights, chosen on rational or clinical grounds, to different rating …
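As a rough illustration of the technique this abstract describes (a sketch of a weighted Cohen's kappa for the two-observer case, not code from either volume; the function name and the penalty-dictionary form of the weights are assumptions for the example):

```python
from collections import Counter

def weighted_kappa(ratings_a, ratings_b, weights=None):
    """Cohen's kappa for two observers, with optional disagreement weights.

    weights[(i, j)] is the penalty when observer A assigns category i and
    observer B assigns category j (0 = full agreement, 1 = full disagreement).
    Defaults to the unweighted kappa: 0 on the diagonal, 1 elsewhere.
    """
    categories = sorted(set(ratings_a) | set(ratings_b))
    n = len(ratings_a)
    if weights is None:
        weights = {(i, j): 0.0 if i == j else 1.0
                   for i in categories for j in categories}
    # Observed disagreement: mean penalty over the jointly rated items.
    observed = sum(weights[(a, b)] for a, b in zip(ratings_a, ratings_b)) / n
    # Expected disagreement if the two observers rated independently,
    # using each observer's marginal category frequencies.
    count_a = Counter(ratings_a)
    count_b = Counter(ratings_b)
    expected = sum(weights[(i, j)] * count_a[i] * count_b[j]
                   for i in categories for j in categories) / (n * n)
    return 1.0 - observed / expected

# Two observers classifying ten items into categories 1-3.
a = [1, 1, 2, 2, 3, 3, 1, 2, 3, 1]
b = [1, 2, 2, 2, 3, 3, 1, 1, 3, 1]
print(round(weighted_kappa(a, b), 3))  # → 0.697
```

With the default weights this reduces to the ordinary kappa; passing graded penalties (e.g. smaller weights for adjacent categories) gives the weighted form the abstract alludes to.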