Context: Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education.

Objective: To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention.

Data Sources: Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011.

Study Selection: Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals.

Data Extraction: Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors.

Data Synthesis: From a pool of 10 903 articles, we identified 609 eligible studies enrolling 35 226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I² > 50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n = 118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n = 210), 1.09 (95% CI, 1.03-1.16) for process skills (n = 426), 1.18 (95% CI, 0.98-1.37) for product skills (n = 54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n = 20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n = 50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n = 32). Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality.
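The pooled estimates above come from a random-effects model with I² used to gauge heterogeneity. As a rough illustration of how such pooling works (not the authors' actual analysis, software, or data), the sketch below applies the DerSimonian-Laird estimator to made-up per-study standardized mean differences and variances.

```python
# Minimal sketch of random-effects pooling (DerSimonian-Laird).
# The effect sizes and variances below are hypothetical, for illustration only.
import numpy as np

def pool_random_effects(effects, variances):
    """Pool study effect sizes with a DerSimonian-Laird random-effects model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)

    # Fixed-effect (inverse-variance) weights and Cochran's Q
    w = 1.0 / variances
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)
    df = len(effects) - 1

    # Between-study variance (tau^2) and I^2 heterogeneity
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

    # Random-effects weights, pooled estimate, and 95% CI
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, i2

# Example with fabricated effect sizes and variances
pooled, ci, i2 = pool_random_effects([1.1, 0.9, 1.4, 1.3], [0.04, 0.06, 0.05, 0.03])
print(f"pooled effect = {pooled:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, I^2 = {i2:.0f}%")
```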

Conclusion: In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.

Copyright 2011 by the American Medical Association. All Rights Reserved.