Andrew Hall: Development and Evaluation of a National Simulation-Based Examination in Emergency Medicine

Introduction: Facing concerns about deficiencies in current assessment processes and calls for reform,(1) postgraduate medical education is transitioning to Competency-Based Medical Education (CBME). CBME aims to meet societal demands for increased accountability with a renewed commitment to the assessment of competence.(2) Workplace-based assessment in isolation is fraught with difficulties, and existing high-stakes examinations have their own limitations. Having been used for assessment in other industries,(3) health professions,(4) and specialties,(5) simulation has been called upon as a solution: it provides a unique opportunity for standardized assessment, independent of actual clinical care.

Objectives: Using rigorous educational research methodologies and capitalizing on investigator expertise from across Canada, our study aims to design, implement, and evaluate a novel summative, standardized, national simulation-based examination across emergency medicine (EM) training sites in Canada.


Phase 1 – Content Selection: A content sampling strategy drawing on recently derived national EM simulation curricular content(6) and the FRCPC-EM Entrustable Professional Activities (EPAs)(7, 8) will be employed to determine priority examination content. A modified Delphi process(9-11) with iterative rounds of questionnaires and controlled feedback will be conducted, with purposeful sampling to ensure broad stakeholder participation, including program directors and the EM specialty and examination committees.

Phase 2 – Assessment Design: Blueprinting will be used to design a series of stations,(12) to be administered as a standardized simulation-based objective structured clinical examination (OSCE).

Phase 3 – Assessment Implementation: The simulation-based OSCE will be piloted, then administered locally at 16 of the 17 FRCPC-EM training sites in Canada to all PGY-3 residents and a subset of PGY-1 to PGY-5 residents. Performances will be assessed in real time by local faculty and video-recorded for review by an external blinded assessor. Assessors will undergo standardized rater training(13) and will use the Resuscitation Assessment Tool,(13-15) supplemented by the Ottawa Surgical Competency Operating Room Evaluation (O-SCORE) for entrustment.(16-18)

Phase 4 – Assessment Evaluation: A rigorous validity argument for the assessment will be constructed using Kane's validity framework,(4, 19) specifically measuring interrater reliability, discriminatory capability, generalizability and variance partitioning (G- and D-studies), and correlation with other accessible variables (e.g., EPA assessments, in-training examinations). Pre- and post-examination survey results will inform an additional evaluation against Norcini's Consensus Framework for Good Assessment,(20) including feasibility, educational effect, catalytic effect, and acceptability.
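To illustrate the generalizability analysis referred to above, the sketch below estimates variance components for a one-facet crossed design (persons × raters) from ANOVA mean squares, then projects a relative G coefficient for a hypothetical number of raters (a D-study). This is a minimal illustration using hypothetical scores and invented function names, not the study's actual multi-facet analysis, which would use dedicated software and the full design.

```python
# Minimal one-facet G-study sketch: scores[p][r] is the rating of person p
# by rater r, one observation per cell. All data here are hypothetical.

def g_study(scores):
    """Estimate variance components (person, rater, interaction/error)."""
    n_p, n_r = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n_p * n_r)
    p_means = [sum(row) / n_r for row in scores]
    r_means = [sum(scores[p][r] for p in range(n_p)) / n_p for r in range(n_r)]

    # ANOVA mean squares for the crossed p x r design
    ms_p = n_r * sum((m - grand) ** 2 for m in p_means) / (n_p - 1)
    ms_r = n_p * sum((m - grand) ** 2 for m in r_means) / (n_r - 1)
    ms_pr = sum((scores[p][r] - p_means[p] - r_means[r] + grand) ** 2
                for p in range(n_p) for r in range(n_r)) / ((n_p - 1) * (n_r - 1))

    var_pr = ms_pr                            # person-by-rater interaction + error
    var_r = max((ms_r - ms_pr) / n_p, 0.0)    # rater main effect (severity/leniency)
    var_p = max((ms_p - ms_pr) / n_r, 0.0)    # true person (examinee) variance
    return var_p, var_r, var_pr

def d_study(var_p, var_pr, n_raters):
    """Relative G coefficient if each examinee were scored by n_raters raters."""
    return var_p / (var_p + var_pr / n_raters)
```

In this framing, a large rater component relative to person variance would signal the need for more raters per station or further rater training, which is the practical question a D-study answers.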

Phase 5 – Longitudinal Evaluation: Trainee performances will be compared with their Royal College examination results and their programs' progression decisions. Additionally, we will interview stakeholders at one and two years post-implementation to understand the utility of this simulation-based examination in the certification process.

Impact: This project has been endorsed by the Royal College, the FRCPC-EM National Specialty Committee, and the EM Simulation Educators Research Collaborative (EM-SERC) as a priority innovation.(21) By ultimately including simulation in our EM certification process, we signal core priorities in our medical curricula(22) and emphasize competencies previously not well assessed, such as communication and crisis leadership. This will move our national program of assessment into the modern age, where technology is thoughtfully harnessed to ensure the graduation of competent trainees ready to meet the demands of independent practice in Canada.


  1. Cooke M, Irby DM, O'Brien BC. Educating physicians: a call for reform of medical school and residency. John Wiley & Sons; 2010.
  2. Holmboe ES, Sherbino J, Englander R, Snell L, Frank JR, Collaborators I. A call to action: The controversy of and rationale for competency-based medical education. Med Teach. 2017;39(6):574-81.
  3. Salas E, Bowers CA, Rhodenizer L. It is not how much you have but how you use it: toward a rational use of simulation to support aviation training. Int J Aviat Psychol. 1998;8(3):197-208.
  4. Tavares W, Brydges R, Myre P, Prpic J, Turner L, Yelle R, et al. Applying Kane’s validity framework to a simulation based assessment of clinical competence. Adv Health Sci Educ Theory Pract. 2018;23(2):323-38.
  5. Chiu M, Tarshis J, Antoniou A, Bosma TL, Burjorjee JE, Cowie N, et al. Simulation-based assessment of anesthesiology residents’ competence: development and implementation of the Canadian National Anesthesiology Simulation Curriculum (CanNASC). Can J Anaesth. 2016;63(12):1357-63.
  6. Kester-Greene N, Hall AK, Walsh CM. Simulation curricular content in postgraduate emergency medicine: A multicentre Delphi study. CJEM. 2019;21(5):667-75.
  7. Sherbino J, Bandiera G, Doyle K, Frank JR, Holroyd BR, Jones G, et al. The competency-based medical education evolution of Canadian emergency medicine specialist training. CJEM. 2019:1-8.
  8. Royal College of Physicians and Surgeons of Canada. Entrustable Professional Activities for Emergency Medicine. Ottawa: Royal College of Physicians and Surgeons of Canada; 2018.