Keywords: High-fidelity simulation, Evaluation Scale, Medical Evaluation, Acute Care, CanMEDS
Introduction: High-fidelity simulation is an efficient and holistic teaching method. However, assessing trainee performance during simulation remains a challenge. We aimed to develop a CanMEDS competency-based global rating scale for internal medicine trainees managing simulated acute care scenarios.
Methods: Our scale was developed using a formal Delphi process. Validity was tested using six videotaped scenarios of two residents managing unstable atrial fibrillation, rated by six experts. Psychometric properties were determined using a generalizability (G-) study, and usability was assessed with a satisfaction questionnaire.
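To make the G-study concrete, below is a minimal sketch, not the authors' analysis, of how variance components and a relative G coefficient can be estimated for a one-facet, fully crossed residents-by-raters design. The function name, the 2 x 6 score matrix, and all scores are illustrative assumptions only.

import numpy as np

def g_study(ratings: np.ndarray) -> dict:
    """ratings: persons x raters matrix of scores (fully crossed, one score per cell)."""
    n_p, n_r = ratings.shape
    grand = ratings.mean()
    person_means = ratings.mean(axis=1)
    rater_means = ratings.mean(axis=0)

    # Mean squares from the two-way ANOVA decomposition (no replication).
    ms_p = n_r * np.sum((person_means - grand) ** 2) / (n_p - 1)
    ms_r = n_p * np.sum((rater_means - grand) ** 2) / (n_r - 1)
    ss_res = np.sum((ratings - person_means[:, None] - rater_means[None, :] + grand) ** 2)
    ms_res = ss_res / ((n_p - 1) * (n_r - 1))

    # Expected-mean-square estimates of the variance components,
    # truncated at zero as is conventional for negative estimates.
    var_p = max((ms_p - ms_res) / n_r, 0.0)   # person (resident) variance
    var_r = max((ms_r - ms_res) / n_p, 0.0)   # rater severity variance
    var_res = ms_res                          # residual (interaction + error)

    # Relative G coefficient for a mean score over n_r raters.
    g_rel = var_p / (var_p + var_res / n_r)
    return {"var_person": var_p, "var_rater": var_r,
            "var_residual": var_res, "g_relative": g_rel}

# Hypothetical data: 2 residents rated by 6 experts (fabricated scores;
# so small a person sample would estimate person variance poorly).
scores = np.array([
    [3.0, 3.5, 4.0, 3.0, 3.5, 3.0],
    [4.0, 4.5, 4.0, 4.5, 5.0, 4.0],
])
print(g_study(scores))

In generalizability terms, g_relative rises toward 1 as rater-related noise shrinks relative to true between-resident differences; averaging over more raters (larger n_r) is the usual way to raise it.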
Results: Most evaluators rated the usability of our scale favorably and attested that the tool fully covered the CanMEDS competencies. The scale showed low to intermediate generalizability.
Conclusions: This study provides some validity arguments for our scale. The best-assessed aspect of performance was communication; further studies are planned to gather additional validity evidence and to compare assessment of teamwork and communication during scenarios with multiple versus single residents.