Presented By: Department of Psychology
Methods Hour: Interrater Reliability: The limitations of Cohen's kappa in classroom observation research
Dr. Kai Cortina, Director of Social Work and Psychology and Professor of Psychology, and Blake Ebright, CPEP Graduate Student
Abstract:
Cohen's kappa is the coefficient commonly reported (and requested by reviewers) as a measure of the consistency of two or more raters. Usually, the objects to be rated are clearly defined or given. Our study design was more complex: two raters identified misbehaviors in classroom video and classified them according to a coding scheme. How should interrater reliability be measured and reported in this case? Is it acceptable to base Cohen's kappa (or Krippendorff's alpha) only on the roughly 20% of misbehaviors that both raters identified and coded? Are there alternative approaches for future studies?
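For reference, a minimal sketch of how Cohen's kappa is conventionally computed for two raters over a shared set of rated objects; the category labels below are hypothetical illustrations, not study data, and the computation assumes the raters coded exactly the same events.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters over the same set of rated objects.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from the raters' marginals.
    """
    assert labels_a and len(labels_a) == len(labels_b)
    n = len(labels_a)

    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n

    # Chance agreement from the two raters' marginal category frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    categories = set(freq_a) | set(freq_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: misbehavior categories assigned by each rater to the
# subset of events that both raters independently identified in the video.
rater_1 = ["off-task", "talking", "talking", "aggression", "off-task"]
rater_2 = ["off-task", "talking", "off-task", "aggression", "off-task"]
print(round(cohens_kappa(rater_1, rater_2), 2))  # 0.69
```

The question raised in the abstract is precisely whether restricting this computation to the jointly identified events (dropping the events only one rater flagged) gives an honest picture of reliability.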
Livestream Information
Livestream: February 12, 2021 (Friday), 12:00pm
Joining Information Not Yet Available