Academic Journal

Interrater reliability in neurology objective structured clinical examination across specialties.
Document Type
Article
Source
Medical Teacher. Feb2024, Vol. 46 Issue 2, p239-244. 6p.
Subject
*OCCUPATIONAL roles
*NEUROLOGY
*CONFIDENCE intervals
*MULTIVARIATE analysis
*REGRESSION analysis
*INTER-observer reliability
*EDUCATIONAL tests & measurements
*CLINICAL competence
*INTRACLASS correlation
*EDUCATORS
*NEUROLOGIC examination
*MEDICAL specialties & specialists
*VIDEO recording
*EVALUATION
Language
English
ISSN
0142-159X
Abstract
To assess interrater reliability and the examiner characteristics, especially specialty, associated with scoring of a neurology objective structured clinical examination (OSCE). During a neurology mock OSCE, five randomly chosen student volunteers were filmed while performing 1 of the 5 stations. Video recordings were scored by physicians from the Lyon and Clermont-Ferrand university teaching hospitals to assess students' performance using both a checklist score and a global rating scale. Interrater reliability between examiners was assessed using the intraclass correlation coefficient. Multivariable linear regression models, including the video recording as a random effect, were performed to detect factors associated with scoring. Thirty examiners, including 15 (50%) neurologists, participated. The intraclass correlation coefficients of checklist scores and global ratings between examiners were 0.71 (95% CI [0.45, 0.95]) and 0.54 (95% CI [0.28, 0.91]), respectively. In multivariable analyses, no factor was associated with checklist scores, while male gender of the examiner was associated with a lower global rating (β coefficient = –0.37; 95% CI [–0.62, –0.11]). Our study demonstrated, through a video-based scoring method, that agreement among examiners in a neurology OSCE was good using checklist scoring but moderate using the global rating scale. Examiner specialty did not affect scoring, whereas gender was associated with the global rating scale. [ABSTRACT FROM AUTHOR]