Teresa Chan: Evaluating the Assessments: A Multi-centre Program Evaluation of Emergency Medicine Entrustable Professional Activity Assessment Data from 2018-2023

Project Summary/Abstract

In 2018, the Royal College of Physicians and Surgeons of Canada (RCPSC) Emergency Medicine (EM) training programs entered the competency-based medical education (CBME) era by welcoming the first class of residents to be assessed against the new specialty standards and entrustable professional activities (EPAs).

In this document, we outline a national program evaluation plan for tracking EPA achievement. Participating programs from across Canada will report aggregate, anonymized data, allowing a meta-analysis that will support national standard setting.

2.0  Background/Scientific Rationale

Competence By Design (CBD) is a competency-based national assessment system developed by the Royal College of Physicians and Surgeons of Canada.1 The RCPSC EM programs began using this assessment system with the 2018 cohort of EM residents.1,2 It requires residents to demonstrate competence in the core domains of their specialty by being assessed on the performance of Entrustable Professional Activities (EPAs).

The EPAs are stage-specific, arranged across the four stages of training: Transition to Discipline (TTD), Foundations (F), Core (C), and Transition to Practice (TTP). Targets have been set for the number of observations of each EPA that each resident should try to obtain within these stages. These targets were based on expert opinion and on calculations factoring in the average number of EM rotations and shifts a trainee would have in a typical FRCPC EM training program.

We anticipate that there will be substantial heterogeneity in the success of the CBD rollout across institutions. Further, each program varies in the distribution of rotations and in the number of opportunities for assessment during each stage. As such, we feel it is important to get a comprehensive view of the feasibility of achieving the suggested number of EPA observations in programs across the country, and of the degree to which each program required this number of EPAs to be observed prior to promotion. We aim to conduct a multi-centre program evaluation of this new assessment system by amalgamating anonymized data from as many participating programs as possible. The results will be used to guide the continued evolution and development of the national EM EPA program.

In addition, the literature suggests that workplace-based assessments in EM can be prone to gender bias, creating a hidden form of discrimination that programs and their faculty members must actively combat.3,4 With the development of a shared database of CBD data, it will be possible to explore these and other educational questions at a national level.

3.0 Objectives/Aims

We aim to gather and aggregate anonymized EPA assessment and achievement data for the first five cohorts of CBD residents since the CBD launch to determine the following:

  • The number of EPAs attempted (scores of 1-5 on the O-SCORE scale5) and achieved (scores of 4-5 on the O-SCORE5) by each resident (see the sketch after this list).
  • Whether there are systematic differences in EPA acquisition by program, gender, or rotation make-up.
  • Whether there are differences in the qualitative data (e.g., word count, quality, content) gathered by each site, or by gender or rotation.
  • These data will be used to inform additional studies that further explore unexpected results.
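
To make the attempted/achieved distinction concrete, the following is a minimal sketch (in Python with pandas) of how these counts could be tallied from raw EPA observations. The file name and column names (resident_id, epa_code, oscore) are hypothetical placeholders for illustration, not the actual extraction-form fields.

    # Tally "attempted" (any O-SCORE rating, 1-5) vs. "achieved" (rating 4-5)
    # EPA observations per resident. File and column names are hypothetical.
    import pandas as pd

    observations = pd.read_csv("epa_observations.csv")  # one row per EPA observation

    summary = (
        observations
        .assign(achieved=observations["oscore"] >= 4)
        .groupby(["resident_id", "epa_code"])
        .agg(attempted=("oscore", "size"), achieved=("achieved", "sum"))
        .reset_index()
    )
    print(summary.head())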

The primary goal is for the data from this analysis to be fed back to our EM specialty committee for improvement of the EPA system at a national scale, but we reserve the right to present the data at conferences and publish this program evaluation data in journals.

4.0 Eligibility

Data will be collected from all sites that are able and willing to apply for and receive a research ethics board (REB) exemption for reporting their data for the purposes of program evaluation.

5.0 Study Design and Procedures

There will be two phases of data collection.

Data Collection

  • Data will be collected in the form of self-report surveys of competence committee (CC) chairs and program directors (PDs) and semi-structured interviews, in which respondents will detail the make-up of their programs, the demographics of their residents, perceived systems-level problems with the EPA system of assessment, and uptake of the EPAs. (See Appendix A.)
  • Data will also be collected via an individualized data extraction form set up for each school (see Appendix B), which allows the local program to enter EPA acquisition data for each year's cohort of residents into a common spreadsheet, according to the scores the trainees have received as of the 5-year mark (i.e., June 30, 2023).
  • The PI and all members of the research team will have access to the data. The local PI at each site will hold the data key that links each data line to a specific resident and will be responsible for safeguarding this re-identification information. Each local PI will also be responsible for locally redacting any name-specific data from their data set before it is sent to our program evaluation site.
  • The data will be kept indefinitely in a password-protected file on an encrypted USB key.
  • Information will be provided to members of the program evaluation team de-identified and coded using a method that cannot be connected back to other personal data.
  • For the purposes of the study, we will link certain subjects to their self-reported usage in order to select the survey participants.

6.0 Expected Risks/Benefits

  • There are no known risks to any of the individuals involved, with the exception of a potential data breach. For this reason, significant efforts will be made to ensure the anonymization of the dataset and the security of local data.

7.0 Data Collection and Management Procedures

Quantitative Assessment Data

  • We will ask each site to fill out a version of Appendix B that has been customized to their site. This will include each resident's start year, their individual "attempted" and "completed" ratio for each EPA, and the month/year in which they were promoted to the next stage.
  • Sites will ensure that this data file has been redacted of personalized information and will send the sheet (password protected) back to the program evaluation study team (a rough pre-submission check is sketched below).
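
As a rough illustration only, the following Python sketch shows how a site could check its completed extraction sheet before sending it in. The actual fields are defined in Appendix B; the file name and column names used here are hypothetical placeholders.

    # Pre-submission check for a site's Appendix B extraction file (hypothetical columns).
    import pandas as pd

    EXPECTED_COLUMNS = [
        "resident_code",   # de-identified code; the key is held by the local PI
        "start_year",      # year the resident started training
        "epa_code",        # EPA identifier
        "attempted",       # number of observations recorded for this EPA
        "completed",       # number of observations rated 4-5 on the O-SCORE
        "promotion_date",  # month/year promoted to the next stage
    ]

    def check_extraction_file(path: str) -> pd.DataFrame:
        """Load a site's extraction sheet and flag missing or identifying columns."""
        df = pd.read_csv(path)
        missing = [c for c in EXPECTED_COLUMNS if c not in df.columns]
        if missing:
            raise ValueError(f"Extraction file is missing columns: {missing}")
        if any(c in df.columns for c in ("resident_name", "date_of_birth")):
            raise ValueError("File still contains identifying fields; redact before sending.")
        return df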

Survey Data:

  • An annual, prospectively gathered survey will be sent to program directors or CC chairs to gather data about the make-up of each cohort's academic program (e.g., the rotations in each year, the number of residents per year, and the gender split of both faculty and residents).
  • We will poll how many shifts the residents are expected to complete per block/month, and also how often they are on home service (EM or PEM) and off service (non-EM rotations).

8.0 Data Analysis

Quantitative:

  • We will use Microsoft Excel to compute simple descriptive statistics of EPA achievement/attempts and a time series of promotion between stages.
  • We will use a generalizability (G) study to determine whether there are significant differences between sites in their achievement of EPAs, and to estimate the contribution to variance in EPA achievement of a number of facets, including the number of teaching faculty, the number of days "on service", and the number of days "off service" (see the sketch after this list).
  • We plan to conduct interim analyses after the Transition to Discipline (TTD), Foundations (F), Core (C), and Transition to Practice (TTP) stages, with technical reports outlining raw achievement levels and timelines sent periodically to the specialty committee and participating sites, without comparative analyses.
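
As a minimal sketch of the site-level portion of this analysis, the Python code below estimates the between-site variance component for EPA achievement using a random-intercept model, a simplified stand-in for a fuller G study with multiple facets. The pooled file and column names (site, epas_achieved) are hypothetical, and the actual analysis may instead be carried out in Excel or dedicated G-theory software.

    # Descriptive statistics by site and a simple between-site variance component
    # (random-intercept model) as a simplified stand-in for a multi-facet G study.
    import pandas as pd
    import statsmodels.formula.api as smf

    pooled = pd.read_csv("pooled_epa_summary.csv")  # one row per resident (hypothetical)

    # Mean and SD of EPAs achieved per resident at each site.
    print(pooled.groupby("site")["epas_achieved"].agg(["count", "mean", "std"]))

    # Residents nested within sites: random intercept per site.
    model = smf.mixedlm("epas_achieved ~ 1", data=pooled, groups=pooled["site"])
    result = model.fit()

    site_var = result.cov_re.iloc[0, 0]   # between-site variance
    resid_var = result.scale              # within-site (residual) variance
    print(f"Proportion of variance attributable to site: "
          f"{site_var / (site_var + resid_var):.2f}")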

9.0 Regulatory Requirements

9.1 Informed Consent

  • A waiver of consent will be obtained from the research ethics board for trainee data access. All data will be cleaned and redacted at the local sites prior to transfer and analysis, thereby minimizing risks to individuals.
  • There are no anticipated risks associated with participation. This study is unlikely to have any adverse effects on individuals, and efforts will be taken to ensure that the data cannot be linked back to specific individuals. All measures will be taken to protect the contents of the data set for this study (i.e., password-protected files and encrypted USB keys).

9.2 Subject Confidentiality

  • Data used for this study will be de-identified and coded using a method that cannot be linked back to the individual.
  • Only members of the program evaluation team at each site will have access to the un-redacted, full dataset, but these individuals already have access to this data.
  • Data contributed by CC Chairs or PDs will not be confidential, but will be amalgamated and redacted for analysis.

9.3 Unanticipated Problems

  • Foreseeably, a site may initially sign up to be part of our program evaluation and then be lost to follow-up because of a change in leadership or a change over time in who has access to the data. We will ask participants to report such changes to our study evaluation team to ensure a smooth transition.
  • The EPAs may change over time, and it will therefore be important to continue monitoring changes to the program as a whole.

10.0 References

  1. The Royal College of Physicians and Surgeons of Canada. EPAs and milestones. http://www.royalcollege.ca/rcsite/cbd/implementation/cbd-milestones-epas-e. Accessed November 22, 2019.
  2. Entrustable Professional Activities. The Royal College of Physicians and Surgeons of Canada website. http://www.royalcollege.ca/rcsite/cbd/implementation/cbd-milestones-epas-e. Published 2019. Accessed February 18, 2019.
  3. Mueller AS, Jenkins T, Osborne M, Dayal A, O’Connor DM, Arora VM. Gender Differences in Attending Physicians’ Feedback for Residents in an Emergency Medicine Residency Program: A Qualitative Analysis. J Grad Med Educ. 2017;9(5):577-585. doi:10.4300/JGME-D-17-00126.1.
  4. Dayal A, O’Connor DM, Qadri U, Arora VM. Comparison of Male vs Female Resident Milestone Evaluations by Faculty During Emergency Medicine Residency Training. JAMA Intern Med. 2017;177(5):651. doi:10.1001/jamainternmed.2016.9616.
  5. Gofton WT, Dudek NL, Wood TJ, Balaa F, Hamstra SJ. The Ottawa Surgical Competency Operating Room Evaluation (O-SCORE): A Tool to Assess Surgical Competence. Acad Med. 2012;87(10):1401-1407. doi:10.1097/ACM.0b013e3182677805.
