All occurrences of this event have passed.
This listing is displayed for historical purposes.

Presented By: Department of Psychology

Methods Hour: Interrater Reliability: The limitations of Cohen's kappa in classroom observation research

Dr. Kai Cortina, Social Work and Psychology Director and Professor of Psychology, and Blake Ebright, CPEP Graduate Student

Abstract:
Cohen's kappa is the coefficient commonly reported (and requested by reviewers) as a measure of the consistency of two or more raters. Usually, the objects to be rated are clearly defined or given. Our study design was more complex: two raters identified misbehaviors in classroom video and classified them according to a coding scheme. How should interrater reliability be measured and reported in this case? Is it acceptable to base Cohen's kappa (or Krippendorff's alpha) only on the roughly 20% of misbehaviors that both raters identified and coded? Are there alternative approaches for future studies?
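
To make the question concrete, the sketch below (illustrative only, not part of the study's analysis; the event IDs, misbehavior codes, and data are hypothetical) computes Cohen's kappa, (p_o - p_e) / (1 - p_e), restricted to the events both raters identified, which is exactly the practice the abstract asks about. Everything only one rater noticed is ignored by this calculation.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e)."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(labels_a) | set(labels_b)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical data: each rater maps the event IDs they noticed to a misbehavior code.
rater_1 = {"e01": "talking", "e02": "off-task", "e04": "talking", "e07": "out-of-seat"}
rater_2 = {"e01": "talking", "e03": "off-task", "e04": "off-task", "e07": "out-of-seat"}

# Kappa based only on jointly identified events, as questioned in the abstract.
shared = sorted(rater_1.keys() & rater_2.keys())
kappa = cohens_kappa([rater_1[e] for e in shared], [rater_2[e] for e in shared])
print(f"{len(shared)} jointly identified events; kappa on that subset = {kappa:.2f}")
```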

Livestream Information

February 12, 2021 (Friday) 12:00pm


Tags

Methods Hours