Original Research

General Internal Medicine (GIM): Do the Puzzle Pieces Portray the Picture? A Continuous Quality Improvement Process for Entrustable Professional Activities (EPAs)

Samantha Halman1, Laura Marcotte2, Michelle Elizov3, Lynfa Stroud4, 5, Jolanta Karpinski6, Sharon E. Card7*

1Department of Medicine, University of Ottawa and The Ottawa Hospital, Ottawa, ON, Canada;

2Department of Medicine, Queen’s University, Kingston, ON, Canada;

3Department of Medicine, Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada;

4Department of Medicine and Wilson Centre, University of Toronto, Toronto, ON, Canada;

5Sunnybrook Health Sciences Centre, Toronto, ON, Canada;

6Royal College of Physicians and Surgeons of Canada and the University of Ottawa, Department of Medicine, Ottawa, ON, Canada;

7Department of Medicine, University of Saskatchewan and the Saskatchewan Health Authority, Saskatoon, SK, Canada

Abstract

Defining General Internal Medicine (GIM) has been difficult due to the tension between ensuring flexibility for varied environments and the need for national standards. With the launch of competency-based medical education, the Royal College of Physicians and Surgeons of Canada Specialty Committee in GIM (SCGIM) (the national standard-setting body) had the opportunity to explicitly define the discipline via elaboration of the GIM competencies and Entrustable Professional Activities (EPAs). Defining the EPAs is, in essence, defining the tasks of the discipline. We describe our SCGIM approach to the continuous review of the theoretical written documentation around EPAs in the “real world” environment, undertaken to continuously refine the EPAs and ensure they are facilitating skill attainment. Major lessons learned: (1) centralized feedback with simple reporting and input from multiple sources is best; (2) there is tension between theory (perfect EPAs) and practical implementation; (3) it takes time to see how the EPAs are performing.

Résumé

Il a été difficile de définir la médecine interne générale (MIG) en raison de la tension entre la nécessité d’assurer la souplesse pour des environnements variés et le besoin de normes nationales. Grâce au lancement de la formation médicale par compétences, le comité de spécialité en MIG (CSMIG) du Collège royal des médecins et chirurgiens du Canada (organisme national de normalisation) a eu l’occasion de définir explicitement la discipline en élaborant les compétences en MIG et les activités professionnelles confiables (APC). Définir les APC consiste essentiellement à définir les tâches de la discipline. Nous décrivons notre approche du CSMIG de l’examen continu de la documentation théorique écrite concernant les APC « en situation réelle » pour améliorer continuellement les APC et veiller à ce qu’elles facilitent l’acquisition des compétences. Principales leçons apprises : a) la rétroaction centralisée avec des rapports simples et des commentaires multiples convient le mieux; b) il existe une tension entre la théorie (APC parfaites) et l’application dans la pratique; c) il faut du temps avant de constater le rendement des APC.

Key words: Entrustable professional activities, quality assurance

Corresponding Author: Sharon E. Card, sharon.card@usask.ca

Submitted: 1 October 2021; Accepted: 1 April 2022; Published: 11 June 2022

DOI: http://dx.doi.org/10.22374/cjgim.v17i2.580

What’s in a Name? That Which We Call General Internal Medicine by Any Other Name Would Be Viewed as Complex

The Royal College of Physicians and Surgeons of Canada (Royal College) defines General Internal Medicine (GIM) as “a subspecialty of Internal Medicine (IM) which encompasses the values of generalism and is characterized by its breadth of clinical activity and alignment of practice profile with health needs of local populations.”1 Until 2010, GIM was not recognized as a distinct subspecialty of IM; yet through the strong advocacy of dedicated GIM educators, the discipline matured educationally and administratively, with many Canadian universities tailoring individualized training experiences for residents choosing to focus on generalism.2 This led to a wide breadth of training programs, albeit without formal recognition or paths to licensure. One of the key challenges in defining GIM as a subspecialty of IM was the perceived infinite variety of possible scopes of practice within the field.2 Although much emphasis was subsequently placed on defining the key characteristics and objectives of training1 common to the practice of GIM, the inherent flexibility in training and practice models was never forgotten. GIM has been the most popular choice amongst all IM subspecialty programs for several consecutive years, with over 19% of all applicants selecting it as their first choice in 2021.3

Creating a General Internist–What Abilities do Graduates Need?

The goal of residency education is to create practitioners who can provide excellent health care, within the scope of their discipline, that meets the needs of the population they serve. Although this is a laudable goal, it is challenged practically within each discipline by the need to remain flexible, allowing graduates to tailor their learning to future practice contexts, while at the same time ensuring a national standard.

The patient population we serve in GIM is complex and multifaceted; as such, our training models also need to be, despite the mandated structure of a Royal College-designed subspecialty. To inform the development of the first Objectives of Training Requirements (OTR) of the discipline,1 the GIM community conducted an in-depth needs assessment of GIM graduates,4,5 identified gaps in training, and created a framework for training based on resolving tensions between standardization and embracing the diversity of the field. The first cohort of Royal College-certified GIM trainees graduated in 2014. At the same time, the medical education world was met with a new challenge: migrating from traditional time-based curricula to curricula focused on competency-based education. In Canada, the framework adopted by the Royal College is Competence by Design (CBD)6; the first two Royal College disciplines launched in 2017, and GIM training programs launched CBD in 2019.

Within CBD, successful completion of training is no longer measured in months or years spent within a discipline but rather by the achievement of competencies specific to the discipline. Training is sequenced in stages of progressing complexity and responsibility over the training period, and the resident’s progress in achieving the competencies of each stage is supported and monitored by the residency program. With CBD, the discipline is defined (as per Englander et al.) both in terms of competencies, “the array of abilities (knowledge, skills, and attitudes) across multiple domains or aspects of performance in a certain context,” and Entrustable Professional Activities (EPAs), “an essential task of a discipline (profession, specialty or subspecialty) that a learner can be trusted to perform without direct supervision, and an individual entering practice can perform unsupervised in a given health care context, once sufficient competence has been demonstrated.”7 The discipline is described fully when visualized through the totality of the Competencies and EPAs. EPAs describe the outcomes to be achieved within each stage and serve as the focus for teaching, learning, feedback on performance, and documentation of achievement; they are sentinel tasks of the discipline, prioritized for observation, coaching, and assessment, that a trainee can be trusted to perform independently once sufficient competence has been demonstrated.7 Similar to the rigorous process previously undertaken to define GIM, its core principles, and its objectives of training (Figure 1), in-depth assessments and consultations with the GIM community were initiated to define our EPAs. The key overarching concepts and elements illustrating the foundation of GIM (represented graphically in Figure 2) were kept at the forefront of EPA development.

Figure 1. Evidence used to inform GIM objectives of training and GIM EPAs2,4,5,8–14

Figure 2. Pictorial illustration of tasks of a general internist (EPAs)23,24

The development of EPAs is a task of each Royal College specialty committee (the national standard-setting body for its discipline). Through a facilitated process, with the support of a clinician-educator from the Royal College, the Specialty Committee in GIM (SCGIM) took on consolidating the breadth of data and creating our structured CBD framework.

As the Royal College embarked on CBD, this was an opportunity for the discipline of GIM to:

  1. Be explicit as to the definition of GIM within the Competencies.

  2. Define the tasks of the discipline within the Entrustable Professional Activities (EPAs).

  3. Develop a process to facilitate an ongoing review of whether residency education is meeting those goals.

As GIM became a subspecialty of Internal Medicine, defining graduate abilities as distinct from those of Internal Medicine specialty training was a focus for both the Competencies and the EPAs. As one step towards determining whether the discipline is meeting the needs of graduates, and hence society, we have embarked on a process of quality review of our theoretical EPAs to see whether they are functioning as they should, i.e., remaining sentinel tasks of the discipline. We illustrate key features of our process that we believe are applicable to other disciplines.

The SCGIM is composed of regional, community, and specialty society representation from diverse practice profiles across Canada, as well as representation from the Division Directors in GIM and all the GIM Program Directors (PDs). The GIM PDs are responsible for implementing the educational construct of CBD, including the EPAs, within their varied institutional circumstances. Through multiple sessions (in person and virtual) beginning in 2016, an iterative review process was used to develop the final GIM EPAs.

Refining GIM EPAs Prior to GIM Launch of Competence by Design in July 2019

The majority of research on EPAs concerns their development, with relatively little attention paid to the implementation phase and whether they “do their job.”15,16 Practically speaking, residents and faculty on the ground have little interest in knowing “how” EPAs were developed but considerable concern about how they fit into busy clinical lives and whether they reflect the realities of clinical practice. Recent work has illustrated the importance of language uniformity and mutual understanding of concepts central to EPAs, the need to ensure EPAs lead to meaningful learning and assessment that applies to the area of practice, and the importance of narrative feedback.17–20

Despite the rigorous body of evidence and expertise informing the development of the EPAs, it was anticipated that, upon implementation, some EPAs would not perform as well as expected in the “real world” and/or would need to be revised.21 That said, there are no guidelines directing national standard-setting bodies on how to audit and refine EPAs. The SCGIM felt strongly that an ongoing process of quality assurance and improvement was central to the success of EPA implementation across the discipline. Over the 18 months prior to the official national CBD launch of GIM programs in July 2019, the SCGIM led a series of EPA pilots and iterative reviews. Although the original intent was to finesse the wording of individual EPAs, the goal evolved over time to “develop a standardized process to allow programs to provide routine feedback to the SCGIM as to the performance of the EPAs as they are implemented.”

The Role of the Specialty Committee in Refining EPAs

Within the purview of the specialty committee, it was important that we create national standards of training that align with the scope and diversity of GIM and prepare graduates to enter independent practice. With that in mind, we aimed to (1) define the tasks of the discipline (EPAs) accurately; (2) write the EPAs in language that is intuitive and makes sense to end-users; and (3) ensure a shared mental model about the intent of each EPA. However, the implementation of those national standards into training will vary due to differences at the program, institutional, and regional levels, which may impact the manner, and perhaps even the feasibility, by which residents attain the required skills. The SCGIM felt it was important to be involved in the continuing quality improvement of EPAs beyond the initial development phase, both to gather information on the original design and to monitor the fidelity of implementation.

Piloting EPAs

Principles of Utilization-Focused Evaluation22 were used to sequentially devise pilots, collate results, present them to the group, revise the pilot process, and repeat. Three cycles of pilots occurred. At each step, the next pilot was developed by the group based on the needs identified in the previous step. The pilots are summarized in Figure 3.

Figure 3. Summary of the EPA pilots

The initial pilot involved all programs fully implementing a single pre-determined EPA and providing feedback using a detailed standardized assessment template designed for this purpose. EPAs and reporting templates were available in French and English. A total of 61 reporting templates were analyzed. From this review, it became clear that focusing on individual components of an EPA was less useful than focusing on the EPA as a whole and ensuring that all users had a shared mental model. For example, an EPA could be interpreted as being based on the unit of “per patient” versus “per clinic.” Similarly, assessors may interpret a rating of “not observed” to mean either that something was not relevant to the patient encounter or that it was missing from the resident’s performance; these differing interpretations may lead to different overall ratings of the resident’s performance. This first pilot raised awareness that the EPAs may be interpreted by frontline users/supervisors differently than the writers presumed. The SCGIM consensus was that discussing differences in interpretation and sharing tips to unify the understanding of the EPAs was more useful than general data analytics.

For the second pilot, the SCGIM had two goals: to understand how several EPAs were operationalized at each school and to evaluate a larger number of EPAs. The SCGIM developed a more streamlined reporting form that allowed programs to give a global assessment of the functioning of several EPAs, as opposed to the more structured reporting around individual data points in pilot one. Although an abundance of data was collected, this was felt to be too cumbersome for programs as a real-world auditing and feedback process. The program directors valued giving more open-ended feedback, but the process needed to be considerate of the administrative workload added to programs. The SCGIM consensus was that reviews needed to be simple and feasible while recognizing the valuable input of those implementing the EPAs.

Finally, in our third pilot, the reporting form was further refined, and each school was assigned a few EPAs to implement and collect data on, with redundancy built into the assignments. Data were aggregated into three categories: things to change, things to monitor, and suggestions for implementation. At the time of the pilot, many of the EPAs had not been extensively used in the training programs. Despite that limitation, and based on the findings of the pilot, the SCGIM made seven changes to the existing EPAs prior to the official implementation of CBD in GIM. From this pilot, we learned that some of the data supervisors were asked to collect about the nature of the specific patient encounter in which the EPA was observed (e.g., GIM clinic vs Other Clinic) proved most problematic and did not provide added educational value. Although these questions were initially added with the intent of ensuring a variety of experiences, in real-world use they added only unnecessary complexity. Further, this pilot taught us that nationally gathered feedback need not be fully inclusive or exhaustive but rather should focus on major actions, such as identifying items needing urgent change and areas to monitor. It also reinforced that sufficient use of the EPAs in the clinical environment was needed before specific feedback could be provided. Suggestions for implementation were shared amongst all programs and added to a “memory data bank” for future changes.

All pilot summaries were discussed within the SCGIM. Over 2 years, a consensus was developed as to what is most important in the process for ongoing quality review of EPAs (Box 1, Appendix B). Consensus was also built that full-scale revisions should not occur frequently but rather only after at least one, and preferably two, cohorts of residents had experienced the EPAs, to allow time to truly see how they are functioning. We continue to collect feedback on the EPAs on an ongoing basis.

Take Home Points from Pre-implementation EPA Pilots

The current GIM EPAs are listed in Appendix A.23 Utilizing the lessons learned in the pilots (see Box 1), we have refined future quality improvement for the EPAs into the regular collection of a “Report Card.”

Box 1. Lessons learned in developing a process for the ongoing quality improvement of EPAs

Utilize a simple feedback format to seek input from those who oversee implementation of EPAs (the Program Directors and the program leadership).
Prioritize feedback on the global overview over granular feedback, including ensuring a shared understanding of the intent of the EPA and sharing tips for implementation.
Develop categorical recommendations, including the urgency with which to implement an educational change.
Create a “memory data bank for future changes” and identify any urgent changes needed, to minimize change fatigue.
Explicitly review the complexity of EPA data categories: for example, variables about the nature of the patient encounter were often added to try to ensure a variety of experiences but frequently added educationally unnecessary complexity.

Next Steps

There is a tension between wanting to ensure “perfection” of document suites and avoiding frequent change, which can be problematic at the program level. Understanding the performance of an EPA (in the absence of a major concern) takes time, and the SCGIM agreed that several iterations and multiple uses of an EPA should be observed before changing the national standards. Discussion amongst the committee reiterated that everyone’s context is different; an individual program’s readiness for change will vary depending on the level of institutional support, the timing of accreditation reviews, and the program’s culture of observation. There is, of course, no “right balance,” but this tension is important to recognize when proceeding with document review and changes.

The national specialty committee can and should provide ongoing quality reviews of the discipline’s EPAs in a structured way. In the future, we need to understand whether the feedback received about EPAs reflects challenges programs face because of the change to CBD, the way the EPAs are written, or EPAs that are “stretch goals” for programs not yet sufficiently evolved to meet the new training requirements.

When GIM embarked on its original journey to develop training standards, the focus on diversity, and the challenge of developing training standards to meet that diversity, became apparent. As we continue reviewing the quality of the EPAs, our long-term goal is to assess how well they prepare residents for practice and produce graduates with the needed skills. Training standards may well raise the bar (for example, requiring more teaching around obstetric medicine); for this reason, our EPA quality review process will continue to focus not just on what is well written but also on whether the EPA rollout is promoting future enhancements to training where those “stretch goals” align with societal needs.

In the future, we look forward to evaluating whether the GIM EPAs truly reflect practice and whether they prepare GIM residents for the diversity of continually evolving GIM practice. This ongoing monitoring of the fidelity of implementation of CBD will assist in mitigating any unintended consequences and ensuring that the educational design produces well-prepared graduates matched to societal needs. At the same time, we are cognizant that the discipline is reflected more broadly within the Competencies, and a parallel study is examining the preparation and needs of graduates across all GIM competencies. Through these studies, GIM seeks to ensure that future graduates are well prepared to meet the needs of society throughout their careers. This includes not only the tasks they must perform but also the ongoing flexibility and adaptability needed in a GIM career to remain responsive to the needs of the local patient population.24 This ongoing strengthening of the subspecialty ensures that training programs meet the needs of GIM graduates above and beyond the training obtained in the specialty of Internal Medicine.

Summary

While fully acknowledging that aspects of EPA implementation determine their educational effectiveness, we believe EPAs can be used to define the desired outcomes of training and illustrate the practice of a discipline. The national standard-setting body plays a crucial role in developing and refining EPAs. Consistent with research highlighting the importance of language and mental-model uniformity17 and meaningfulness,18 our process has shown that qualitative review of EPAs at a national level is more useful than individual data analytics. Local context, institutional factors, local stakeholder involvement, and individualized frameworks all contribute to the landscape of EPA success and make data analytics challenging and less fruitful than narrative processes. As our subspecialty evolves, it will be important to ensure that EPAs continue to capture the tasks of GIM practice. With the emergence of areas of need such as addictions medicine and skills such as point-of-care ultrasound, it is anticipated that some EPAs will disappear, some will be refined, and new ones will be added.

Acknowledgments

With acknowledgment to all GIM Specialty Committee members and workshop participants from 2016 to 2021.

REFERENCES

1. General Internal Medicine objectives of training in the subspecialty of general internal medicine. 2012 Version 1.0. Royal College of Physicians and Surgeons of Canada. National Standards for training prior to July 1, 2019. Available at: https://www.royalcollege.ca/rcsite/documents/ibd/general_internal_medicine_otr_e.pdf Accessed January 6, 2022.

2. Card SE, Clark HD, Elizov M, Kassam N. The evolution of General Internal Medicine (GIM) in Canada: International implications. J Gen Intern Med. 2017;32(5):576–81. 10.1007/s11606-016-3891-z

3. MSM Data and Reports. Canadian Residency Matching Service. Available at: https://www.carms.ca/data-reports/msm-data-reports/. Accessed May 18, 2022.

4. Card SE, Snell L, O’Brien B. Are Canadian General Internal Medicine training program graduates well prepared for their future careers? BMC Med Educ. 2006;6(1):1–9. 10.1186/1472-6920-6-56

5. Card SE, PausJenssen AM, Ottenbreit RC. Determining specific competencies for General Internal Medicine residents (PGY4 and 5). What are they and are programs currently teaching them? A survey of practicing Canadian General Internists. BMC Res Notes. 2011;4(1):1–6. 10.1186/1756-0500-4-480

6. Competency by Design. Royal College of Physicians and Surgeons of Canada. 2022. Available at: https://www.royalcollege.ca/rcsite/cbd/competence-by-design-cbd-e. Accessed May 18, 2022.

7. Englander R, Frank JR, Carraccio C, Sherbino J, Ross S, Snell L on behalf of the ICBME Collaborators. Toward a shared language for competency-based medical education. Med Teach. 2017;39(6):582–7. 10.1080/0142159X.2017.1315066

8. Anderson L, Ward HA, Card SE. Linking General Internal Medicine residency training to human resource needs and roles in a changing health landscape. Univ Saskatchewan Undergrad Res J. 2015;1(2):1–7. 10.32396/usurj.v1i2.105

9. Card SE, Ward HA, Broberg L. Preparing General Internal Medicine residents for the future–Aiming to match training to need–A pilot study in Saskatchewan. CJGIM. 2016;11(2):26–30. 10.22374/cjgim.v11i2.145

10. Cavalcanti RB, Hendricks AC, Card SE. Procedural skills of a general internist–Informed by the front line. CJGIM. 2017;12(3):8–12. 10.22374/cjgim.v12i3.163

11. Cumyn A, Card SE, Gibson P. Education research–GIM. CJGIM. 2019;14(3):23–9. 10.22374/cjgim.v14i3.322

12. Shah R, Melvin L, Cavalcanti RB. EPAs for the ambulatory internist in translation: Findings from a Canadian multi-center survey. CJGIM. 2019;14(3):9–15. 10.22374/cjgim.v14i3.317

13. Access Data and Reports. The Canadian Institute for Health Information. Available at: https://www.cihi.ca/en/access-data-and-reports. Accessed May 18, 2022.

14. Public Health Agency of Canada. 2022. Available at: https://www.canada.ca/en/public-health.html. Accessed May 18, 2022.

15. Shorey S, Lau TC, Lau ST, Ang E. Entrustable professional activities in health care education: A scoping review. Med Educ. 2019;53(8):766–77. 10.1111/medu.13879

16. Post JA, Wittich CM, Thomas KG, et al. Rating the quality of entrustable professional activities: Content validation and associations with the clinical context. J Gen Intern Med. 2016;31(5):518–23. 10.1007/s11606-016-3611-8

17. Melvin L, Rassos J, Stroud L, Ginsburg S. Tensions in assessment: The realities of entrustment in internal medicine. Acad Med. 2020;95(4):609–15. 10.1097/ACM.0000000000002991

18. Hatala R, Ginsburg S, Hauer KER, Gingerich A. Entrustment ratings in internal medicine training: Capturing meaningful supervision decisions or just another rating? J Gen Intern Med. 2019;34(5):740–3. 10.1007/s11606-019-04878-y

19. Marcotte L, Egan R, Soleas E, Dalgamo N, Norris M, Smith C. Assessing the quality of feedback to general internal medicine residents in a competency-based environment. Can Med Educ J. 2019;10(4):e32–47. 10.36834/cmej.57323

20. Gofton W, Dudek N, Barton G, Bhanji F. Workplace-based assessment implementation guide: Formative tips for medical teaching practice. 1st ed. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2017, p. 1–12.

21. Hall AK, Rich J, Dagnone JD, Weersink K, Caudle J, Sherbino J, et al. It’s a marathon, not a sprint: Rapid evaluation of competency-based medical education program implementation. Acad Med. 2020 May;95(5):786–93. 10.1097/ACM.0000000000003040

22. Patton MQ. Utilization-focused evaluation. 4th ed. Thousand Oaks, CA: SAGE Publications; 2008.

23. General Internal Medicine Specialty Committee. EPA guide for General Internal Medicine. Ottawa: Royal College of Physicians and Surgeons of Canada; 2019.

24. General Internal Medicine Specialty Committee. General Internal Medicine competencies. Ottawa: Royal College of Physicians and Surgeons of Canada; 2018.

Appendix A. GIM EPAs summary as of 202123

Transition to discipline
TTD # 1 Assessing and proposing management for patients with common internal medicine presentations.
TTD # 2 Assessing, resuscitating, and providing initial management for patients with acute, unstable medical presentations.
Foundations of discipline
FoD # 1 Applying the GIM approach to the assessment and initial management of patients with any general internal medicine presentation in the acute care setting.
FoD # 2 Applying the GIM approach to the ongoing management of patients with common acute general internal medicine presentations.
FoD # 3 Assessing and providing initial management for patients with common presentations in an outpatient clinic.
Core of discipline
Core # 1 Applying the GIM approach to the ongoing management of complex patients with acute general internal medicine presentations.
Core # 2 Applying the GIM approach to the management of patients with any general internal medicine presentation in the outpatient setting.
Core # 3 Assessing and managing perioperative patients.
Core # 4 Assessing and managing pregnant patients with common or emergent obstetrical medical presentations.
Core # 5 Assessing and counselling women of reproductive age with common chronic general internal medicine conditions.
Core # 6 Providing preventive care and health promotion.
Core # 7 Providing care for patients with end-stage disease.
Core # 8 Stabilizing patients who are critically ill and providing or arranging definitive care.
Core # 9 Documenting clinical encounters.
Core # 10 Leading discussions with patients, their families, and/or other health care professionals in emotionally charged situations.
Core # 11 Providing interpretation of cardiac and respiratory diagnostic tests.
Core # 12 Leading a GIM inpatient team.
Core # 13 Leading a GIM consultation service and/or team.
Core # 14 Managing a longitudinal clinic.
Core # 15 Teaching, coaching, and assessing learners in the clinical setting.
Core # 16 Advancing the discipline and/or patient care through scholarly activity.
Core # 17 Assessing and managing patients in whom there is uncertainty in diagnosis and/or treatment.
Core # 18 Planning and completing personalized training experiences aligned with career plans and/or specific learning needs.
Core # 19 Performing the procedures of General Internal Medicine.
Transition to practice
TTP # 1 Managing a GIM case load/practice.
TTP # 2 Developing a personal learning plan for future practice and ongoing professional development.

© 2018 The Royal College of Physicians and Surgeons of Canada. All rights reserved.

This document may be reproduced for educational purposes only provided that the following phrase is included in all related materials: Copyright © 2018 The Royal College of Physicians and Surgeons of Canada. Referenced and produced with permission. Please forward a copy of the final product to the Office of Specialty Education, attn: Associate Director, Specialties. Written permission from the Royal College is required for all other uses. For further information regarding intellectual property, please contact: documents@royalcollege.ca. For questions regarding the use of this document, please contact: credentials@royalcollege.ca.

Appendix B. Questions within the report card for EPA quality review

Is this EPA relevant to GIM practice?
Does the EPA facilitate feedback that results in resident performance improvement?
Does the EPA promote narrative?
Are any milestones redundant, to be deleted, or reworded?
Any suggestions for contextual variables (drop-down boxes on Form 1)?
Any barriers to completing the suggested number of observations to be achieved?
Any suggestions for other programs when implementing?
Any translation concerns?
Any other comments/reflections on the EPA?
Any comments on the process of submitting feedback about the EPA, or methods that would make that process more efficient for you?