|Jeffrey N. Love, MD, MSc, MHPE||MedStar Georgetown University Hospital, MedStar Washington Hospital Center, Department of Emergency Medicine, Washington, District of Columbia|
|Anne M. Messman, MD||Wayne State University, Sinai-Grace Hospital/Detroit Receiving Hospital, Department of Emergency Medicine, Detroit, Michigan|
|Chris Merritt, MD, MPH, MHPE||Alpert Medical School of Brown University, Rhode Island Hospital/Hasbro Children’s Hospital, Department of Pediatric Emergency Medicine, Providence, Rhode Island|
A few years after completing residency, this author took a job at a new institution as an assistant program director. It seemed that while the residents were very enthusiastic and eager to learn, they were regularly faced with a dilemma: whether or not to complete their weekly reading assignment. Because all residents were given identical weekly reading assignments regardless of rotation, the topics were not always pertinent to what they were doing clinically. As an example, a resident could be on an obstetrics and gynecology rotation while the assigned reading covered trauma. The resident often had to choose between reading about their patients to prepare for their clinical work and completing the assigned reading for the week.
As an enthusiastic new faculty member, this author saw a problem and wanted to fix it. The goal was to create a system where everyone was on their own unique reading schedule, and their reading assignments would be coupled with their clinical work. Tempted to create a new reading curriculum and immediately implement it, the author recalled the prior teaching of medical education mentors: one must be deliberate about the changes made and should study the effects of those changes.
The concept of pairing a resident’s clinical work with their reading assignments is supported by Kolb’s theory of experiential learning.16 Kolb argued that effective learning occurs as learners cycle through four stages: concrete experience, reflective observation, abstract conceptualization, and active experimentation. In the case of the reading curriculum, the plan was for the residents to have the concrete experience of readings relevant to what they were encountering clinically, allowing them to reflect on and actively experiment with the application of this newly acquired knowledge. Kolb’s theory suggests that this process would lead to more effective learning, which was the goal in changing the way reading was assigned to the residents.
In addition to ensuring that the curricular intervention was grounded in educational theory, the author was deliberate in her intention to evaluate the changes made. The Kirkpatrick outcome levels were very helpful in guiding the project toward these evaluation goals.9 First, level 1 outcomes, which concern learner satisfaction, were evaluated: the residents reported their level of satisfaction with the curriculum both before and after the curricular intervention. Level 2 outcomes measure the learner’s acquisition of knowledge. The plan was to improve the learners’ acquisition of knowledge by improving their reading schedule; objectively measuring this outcome will be a long-term goal, perhaps accomplished by evaluating changes in in-training examination scores or some other objective test of knowledge. Level 3 outcomes concern changes in learners’ behavior. Consequently, the amount of time residents spent reading before and after the intervention was evaluated. Residents spent significantly more time reading after the intervention; thus, their behavior (i.e., motivation to read) appeared to have changed. Although level 1 and level 3 outcomes were measured, both were subjective in that they were self-reported by the residents. As a curricular innovation based on education theory, the author was able to publish her work based on these preliminary findings.17 Plans are in place for more rigorous, objective outcomes to evaluate the value of this initiative to the residents’ education.
In an effort to create a more compelling presentation, one of us took a previously well-received presentation on anterior segment ophthalmologic trauma and used educational theory and best practices to redesign it. This was done with the goals of improving participants’ learning and retention. Successful treatment of such traumas is largely dependent upon proper diagnosis based on pattern recognition, otherwise known as non-analytical clinical reasoning.18 Much like electrocardiograms,19 skin disorders,20 and radiographs,21 the more examples one sees, the more proficient one becomes. Expertise is based on building a repertoire of examples against which future experiences can be compared. Consistent with the conceptual basis of non-analytical reasoning, this initiative was designed to maximize participants’ cognitive schema/catalog development through spaced repetition involving specific anterior-segment injuries.22
Consistent with Kolb’s experiential learning theory,16 the structure of this experience was designed to provide staged, multifaceted, authentic learning experiences to maximize the breadth and depth of what was learned. Additional pedagogical principles were employed in the appropriate circumstances.
To activate learning, 10 pictures of specific pathological entities, each paired with clinically related questions, were sent to participants one week before the presentation to complete and return.
II. Presentation (one hour)
The objective was to create a clinical need to know followed by a case-based, concrete learning experience promoted through active learning techniques such as “think-pair-share” and “the one-minute paper.”23 This was carried out in a safe environment that facilitated participant decision-making and engagement in the experience. To view the presentation go to: https://chipcast.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=3a0780f2-0218-457b-a9f4-fb9c9ba493a4.
III. Workshop (one hour)
To build on prior experiences, small-group sessions were designed to promote learning from one another (social learning theory).24,25 Each group received several case scenarios with pictures and questions related to actual complex presentations. In a problem-based learning exercise, each group worked through the cases provided, including (1) What are your initial concerns? (2) How will you proceed with evaluation? (3) What treatment options would you consider? and (4) What do you believe will be the final disposition of this patient?
Ten pictures involving eye trauma not previously seen, each with clinically relevant questions, were sent to participants to complete one week after the presentation. The purpose of this exercise was to provide additional examples for schema development and retention, not to test participants’ post-program knowledge.
Though multiple examples of most of the entities were presented, high-risk presentations such as hyphemas (eight examples) and alkali burns (five examples) received particular emphasis to maximize schema formation for these entities.
Participants’ post-program assessment of the experience was very positive (Kirkpatrick level 1), particularly in regard to learning clinically relevant information (Kirkpatrick level 2). Most reported specific, intended changes in their practice as a result of the experience (Kirkpatrick level 3). Though generally not substantive enough for scholarly work, such subjective “soft outcomes” are appropriate for evaluating and improving this type of experience. From a personal perspective, trying something new and experiencing an increased level of engagement from participants was both energizing and fulfilling as an educator.
In 2014, our emergency medicine (EM) residency was faced with redesigning the pediatrics experience for our residents. The traditional model involving EM residents rotating “off-service” on pediatric inpatient wards had proved ineffective and unpopular. Residents voiced their concern that this learning environment did not resemble their intended practice. Significant time was spent on administrative and other tasks that were not germane to the practice of EM, and the educational benefit of the “ward rotation” was not perceived as worth its opportunity cost. While there was certainly educational opportunity in the pediatric setting that could be valuable to EM residents, it was not clear precisely how best to achieve this benefit.
The decision was made to redesign the experience through the lens of situated learning – the conceptual framework in which learners are welcomed into a community of practice by participating in authentic work, connecting this experience with prior knowledge, and developing relationships with other professionals.25 By situating the learning experience in the authentic care of patients in the emergency department (ED), our hope was that residents would begin to understand the importance of pediatric care in the ED. Residents then followed up on all of their admitted patients during a mentored follow-up experience, rounding on the inpatient wards with a pediatric EM attending physician. In this manner, learners could both further their understanding of the longitudinal outcomes of hospitalization and better recognize the role of the emergency physician (EP) in the continuum of pediatric care.
Framing our intervention in the situated learning framework, we used a common curriculum development rubric described by Kern et al.26 to develop, design and study the new curriculum. Residents, faculty and alumni were surveyed to identify both general and specific needs related to pediatric care in the ED as well as to gain a perspective on the course of pediatric illness and injury. From this needs assessment, we developed goals and objectives for the new curriculum. These learning objectives then guided the selection of appropriate educational methods and strategies. Once implemented, the identified goals and objectives could then guide evaluation strategies by which to determine effectiveness of the curriculum. With a fresh set of intended outcomes focused specifically on application of pediatric knowledge for the EP, we had the means to evaluate whether the curriculum was achieving its goals.
Beyond designing a curriculum, however, the basis in conceptual frameworks and use of a systematic process to guide the development of goals and objectives had the welcome benefit of allowing our team to communicate our experiences to other educators – even those beyond our specialty. Speaking in the language of educational theory and outlining our goals, objectives and outcomes in the Kern framework, our work became more scholarly. We presented our findings at academic meetings and were able to publish our experience for others to read, critique, and build upon.27 To demonstrate the impact of our curriculum to the residency program and medical directorship, we focused on the learners’ satisfaction (Kirkpatrick level 1) and on their perceptions regarding positive impact on clinical care in the ED (admittedly a lower-level learning outcome). Residents also reported self-assessed changes in knowledge (Kirkpatrick level 2). To demonstrate the impact of this initiative on behavior (Kirkpatrick level 3), we used self-reported, retrospective post-then-pre surveys28 in addition to direct observation using a standardized assessment tool based on entrustable professional activities to provide a more objective, higher-level outcome. Without a firm basis in educational theory, this project would have remained a closed process, perhaps locally successful but surely not generalizable beyond our institution. Depending on the outcome, other curricula may require more objective data, such as tests of knowledge acquisition or changes in behavior, to reach the threshold of scholarly educational innovation. By incorporating education theory from the start, we were able to improve both our product (the curriculum itself) and our scholarly impact.
Our hope is that these three examples will assist those who are interested in making their next educational intervention more evidence-based. In addition to the references provided, a “toolbox” of potential resources has been included (Table 2) to facilitate the development of evidence-based initiatives and the achievement of scholarly results.
|Best Evidence Medical Education (BEME)||https://www.bemecollaboration.org/Publications+Evidence+Based+Medical+Education/||The BEME Collaboration is an international group of individuals, universities and professional organizations committed to the development of evidence-informed education in the medical and health professions.|
|“Twelve tips” series||Website for Medical Teacher: https://www.tandfonline.com/loi/imte20||This series provides practical advice in the form of 12 short hints or tips on medical education topics of interest (e.g., evaluating educational programs, flipping the classroom).|
|ALiEM Academic Primer Series||https://www.aliem.com/2017/06/academic-primer-series-curated-collections-for-educators/||Collection of nine narrative reviews on important medical education topics, highlighting the most important literature and their defined importance for junior educators and faculty developers.|
|Curated Collections for Educators||“Five Key Papers” series – published in both WestJEM and Cureus||This series provides the five most important papers on specific topics of importance in medical education. Topic examples include educational scholarship in junior academic faculty and digital scholarship.|
|International Clinician Educators (ICE) Blog||https://icenetblog.royalcollege.ca||This blog promotes discussion among clinician educators from around the world, archiving a variety of education resources.|
|Key Literature in Medical Education (KeyLIME) podcast||https://icenetblog.royalcollege.ca||This is a weekly podcast produced by the Royal College of Physicians and Surgeons of Canada that provides a summary and analysis of a medical education article in under 30 minutes.|
|AMEE Guides||https://amee.org/publications/amee-guides||AMEE Guides cover topical issues in medical and healthcare professions education and provide information, practical advice and support.|
|ALiEM Education Theory Made Practical eBooks||https://www.aliem.com/2017/08/education-theory-made-practical-volume-1/||This is a free, peer-reviewed eBook that explains educational theories and how they can be integrated into educational practice.|
ALiEM, Academic Life in Emergency Medicine; WestJEM, Western Journal of Emergency Medicine; AMEE, Association for Medical Education in Europe.
Section Editor: Douglas S. Ander, MD
Full text available through open access at http://escholarship.org/uc/uciem_westjem
Address for Correspondence: Jeffrey N. Love, MD, MSc, MHPE, MedStar Georgetown University Hospital, MedStar Washington Hospital Center, Department of Emergency Medicine, 5800 Reservoir Rd, NW, Washington, DC 20007. Email: Jlove01@georgetown.edu.
1/2019; 20:1-5
Submission history: Revision received October 18, 2018; Accepted October 18, 2018
Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.
1. Frank JR, Snell LS, Ten Cate O, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-45.
2. Cooke M, Irby DM, O’Brien BC. Educating physicians: a call for reform of medical school and residency. J Chiropr Educ. 2011;25(2):193-5.
3. Coates W, Runde DP, Yarris LM, et al. Creating a cadre of fellowship-trained medical educators: a qualitative study of faculty development program leaders’ perspectives and advice. Acad Med. 2016;91:1696-1704.
4. Steinert Y, Mann K, Anderson B, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: A 10-year update: BEME Guide No. 40. Med Teach. 2016;38(8):769-86.
5. Tekian A, Artino AR. AM last page: master’s degree in health professions education programs. Acad Med. 2013;88(9):1399.
6. Bordage G. Conceptual frameworks to illuminate and magnify. Med Educ. 2009;43(4):312-9.
7. Bordage G. Reasons reviewers reject and accept manuscripts: the strengths and weaknesses in medical education reports. Acad Med. 2001;76(9):889-96.
8. Cook DA, Beckman TJ, Bordage G. Quality of reporting of experimental studies in medical education: a systematic review. Med Educ. 2007;41(8):737-45.
9. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 2006.
10. McNeil RC. A program evaluation model: using Bloom’s taxonomy to identify outcome indicators in outcomes-based program evaluations. J Adult Educ. 2011;40(2):24-9.
11. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-7.
12. Moore DE, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1-15.
13. Armstrong EG, Barsion SJ. Using an outcomes-logic-model approach to evaluate a faculty development program for medical educators. Acad Med. 2006;81(5):483-8.
14. Artino AR, La Rochelle JS, Dezee KJ, et al. Developing questionnaires for educational research: AMEE Guide No. 87. Med Teach. 2014;36(6):463-74.
15. Govaerts M, van der Vleuten CPM. Validity in work-based assessment: expanding our horizons. Med Educ. 2013;47(12):1164-74.
16. Kolb D. Experiential Learning. 1984.
17. Messman AM, Walker I. Development of a case-based reading curriculum and its effect on resident reading. West J Emerg Med. 2018;19(1):139-41.
18. Norman G, Young M, Brooks L. Non-analytical models of clinical reasoning: the role of clinical experience. Med Educ. 2007;41(12):1140-5.
19. Hatala RM, Brooks LR, Norman GR. Influence of a single example upon subsequent electrocardiogram interpretation. Teach Learn Med. 1999;11(2):110-7.
20. Norman GR, Rosenthal D, Brooks LR, et al. The development of expertise in dermatology. Arch Dermatol. 1989;125(8):1063-8.
21. Norman GR, Brooks LR. The non-analytical basis of clinical reasoning. Adv Health Sci Educ Theory Pract. 1997;2(2):173-84.
22. Cook DA, Levinson AJ, Garside S, et al. Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med. 2010;85(5):909-22.
23. Wolff M, Wagner MJ, Poznanski S, et al. Not another boring lecture: engaging learners with active learning techniques. J Emerg Med. 2015;48(1):85-93.
24. Boud D, Middleton H. Learning from others at work: communities of practice and informal learning. J Workpl Learn. 2003;15(5):194-202.
25. Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. 1991.
26. Kern DE, Thomas PA, Hughes MT. Curriculum Development for Medical Education: A Six-Step Approach. 2009.
27. Merritt C, Gaines SA, Smith J, et al. A novel curriculum to optimize emergency medicine residents’ exposure to pediatrics. West J Emerg Med. 2017;18(1):14-9.
28. Skeff KM, Stratos GA, Bergen MR. Evaluation of a Medical faculty development program: a comparison of traditional pre/post and retrospective self-assessment ratings. Eval Health Prof. 1992;15(3):350-66.
Increased learning and improved retention from teaching activities.
Results from questionnaires that accurately reflect what authors are attempting to understand.
Curricula and programs that successfully meet their intended goals.
Valid assessments of performance.
Outcomes that demonstrate success to peers, administration, and the larger education community.
The opportunity to publish outcomes if the project is innovative and/or scholarly in its approach.
Modeling an evidence-based approach for learners to emulate.