Breaking Down the Objective Structured Clinical Examination: An Evaluation of the Helping Babies Breathe OSCEs.
Seto, Teresa L. MD; Tabangin, Meredith E. MPH; Taylor, Kathryn K. BS; Josyula, Srirama BS; Vasquez, Juan Carlos MD; Kamath-Rayne, Beena D. MD, MPH
Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare.
12(4):226-232, August 2017.
Introduction: Helping Babies Breathe (HBB) is a simulation-based neonatal resuscitation curriculum designed for low-resource settings. At the completion of the workshop, learners complete the following four assessments: a multiple-choice question (MCQ) test, a bag-mask ventilation (BMV) checklist, and two objective structured clinical examinations (OSCEs). Objective structured clinical examinations are clinical performance assessments that evaluate learners' skills in simulated scenarios. The aims of this study were (1) to evaluate the validity and reliability of the OSCEs used in the HBB curriculum, (2) to conduct an itemized analysis of the OSCEs to identify specific deficits in knowledge and performance, and (3) to identify areas of improvement for future versions of HBB.
Methods: Seventy physicians and nurses completed an HBB workshop conducted in Spanish at a Honduran community hospital. Validity and reliability were examined using an item analysis of item difficulty, discrimination, correlation, and internal consistency/reliability.
Results: Posttest scores were higher than pretest scores on all assessments. Most items on the OSCEs were of low difficulty and low discrimination. Item agreement was lowest for multistep items.
Conclusions: As summative and formative assessments of performance in simulated neonatal resuscitation, the HBB OSCEs are effective because most learners were able to perform the skills correctly after an HBB workshop. On the basis of our results, we recommend changes to future editions of HBB, including the following: simplification of multistep items to single tasks, use of a global rating scale, provision of additional scenarios, and specific instructions to raters on how to grade OSCEs and promote self-reflection to enhance debriefings/feedback. Further validation and study of the OSCEs in the second edition of HBB would enhance their quality and translation into clinical performance.
© 2017 Society for Simulation in Healthcare