Archives

Medical Student Performance on the National Board of Medical Examiners Emergency Medicine Advanced Clinical Examination and the National Emergency Medicine M4 Exams

Volume 16, Issue 6, November 2015.
Katherine Hiller, MD, MPH, et al.

Introduction: In April 2013, the National Board of Medical Examiners (NBME) released an Advanced
Clinical Examination (ACE) in emergency medicine (EM). In addition to this new resource, CDEM
(Clerkship Directors in EM) provides two online, high-quality, internally validated examinations.
National usage statistics are available for all three examinations; however, it is currently unknown how
students entering an EM residency perform as compared to the entire national cohort. This information
may help educators interpret examination scores of both EM-bound and non-EM-bound students.
Objectives: The objective of this study was to compare EM clerkship examination performance
between students who matched into an EM residency in 2014 and students who did not. Comparisons
were made using the EM-ACE and both versions of the National fourth-year medical student (M4)
EM examinations.
Methods: In this retrospective multi-institutional cohort study, the EM-ACE and either Version 1 (V1)
or 2 (V2) of the National EM M4 examination were given to students taking a fourth-year EM rotation
at five institutions between April 2013 and February 2014. We collected examination performance,
including the scaled EM-ACE score and percent correct on the EM M4 exams, as well as 2014 NRMP
Match status. Student's t-tests were performed on the examination averages of students who matched
in EM as compared with those who did not.
Results: A total of 606 students from five different institutions took both the EM-ACE and one of the
EM M4 exams; 94 (15.5%) students matched in EM in the 2014 Match. The mean scores for EM-bound
students on the EM-ACE, V1, and V2 of the EM M4 exams were 70.9 (n=47, SD=9.0), 84.4 (n=36,
SD=5.2), and 83.3 (n=11, SD=6.9), respectively. Mean scores for non-EM-bound students were 68.0
(n=256, SD=9.7), 82.9 (n=243, SD=6.5), and 74.5 (n=13, SD=5.9). There was a significant difference
in mean scores between EM-bound and non-EM-bound students for the EM-ACE (p=0.05) and V2 (p<0.01)
but not V1 (p=0.18) of the National EM M4 examination.
Conclusion: Students who successfully matched in EM performed better on all three exams at the
end of their EM clerkship.
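As an illustration only (not part of the original study), the reported EM-ACE comparison can be approximated directly from the summary statistics given above. The sketch below uses SciPy's two-sample t-test from summary data and assumes equal variances, as in a standard Student's t-test.

# Sketch: compare EM-bound vs. non-EM-bound EM-ACE scores from the reported summary statistics.
from scipy import stats

t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=70.9, std1=9.0, nobs1=47,    # EM-bound students (as reported)
    mean2=68.0, std2=9.7, nobs2=256,   # non-EM-bound students (as reported)
    equal_var=True,                    # assumption: equal-variance (Student's) t-test
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")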

Read More

Competency Assessment in Senior Emergency Medicine Residents for Core Ultrasound Skills

Volume 16, Issue 6, November 2015.
Jessica N. Schmidt, MD, MPH, et al.

Introduction: Quality resident education in point-of-care ultrasound (POC US) is becoming
increasingly important in emergency medicine (EM); however, the best methods to evaluate
competency in graduating residents have not been established. We sought to design and implement
a rigorous assessment of image acquisition and interpretation in POC US in a cohort of graduating
residents at our institution.
Methods: We evaluated nine senior residents in both image acquisition and image interpretation for
five core US skills (focused assessment with sonography for trauma [FAST], aorta, echocardiogram
[ECHO], pelvic, central line placement). Image acquisition was assessed using an observed clinical
skills exam (OSCE)-style directed assessment with a standardized patient model. Image interpretation
was measured with a multiple-choice exam including normal and pathologic images.
Results: Residents performed well on image acquisition, with an average score of 85.7% for core
skills and 74% when advanced skills (ovaries, advanced ECHO, advanced aorta) were included.
Residents scored well but slightly lower on image interpretation, with an average score of 76%.
Conclusion: Senior residents performed well on core POC US skills as evaluated with a rigorous
assessment tool. This tool may be developed further for other EM programs to use for graduating
resident evaluation.

Read More

Mentoring during Medical School and Match Outcome among Emergency Medicine Residents

Volume 16, Issue 6, November 2015.
Erin Dehon, PhD, et al.

Introduction: Few studies have documented the value of mentoring for medical students, and
research has been limited to subjective outcomes (e.g., job satisfaction, perceived career preparation)
rather than objective ones. This study examined whether having a mentor is associated with
match outcome (where a student matched based on their rank order list [ROL]).
Methods: We sent a survey link to all emergency medicine (EM) program coordinators to distribute
to their residents. EM residents were surveyed about whether they had a mentor during medical
school. Match outcome was assessed by asking residents where they matched on their ROL (e.g.,
first choice, fifth choice). They were also asked about rank in medical school, type of degree (MD vs.
DO), and performance on standardized tests. Residents who indicated having a mentor completed
the Mentorship Effectiveness Scale (MES), which evaluates behavioral characteristics of the
mentor and yields a total score. We assessed correlations among these variables using Pearson’s
correlation coefficient. A post-hoc analysis using an independent-samples t-test was conducted to compare
the MES score between those who matched to their first or second choice vs. third or
higher choice.
Results: Participants were a convenience sample of 297 EM residents. Of those, 199 (67%)
reported having a mentor during medical school. Contrary to our hypothesis, there was no significant
correlation between having a mentor and match outcome (r=0.06, p=0.29). Match outcome was
associated with class rank (r=0.13, p=0.03), satisfaction with match outcome (r= -0.37, p<0.001),
and type of degree (r=0.12, p=0.04). Among those with mentors, a t-test revealed that the MES
score was significantly higher among those who matched to their first or second choice (M=51.31,
SD=10.13) compared to those who matched to their third or higher choice (M=43.59, SD=17.12),
t(194)=3.65, p<0.001, d=0.55.
Conclusion: Simply having a mentor during medical school does not impact match outcome, but
having an effective mentor is associated with a more favorable match outcome among medical
students applying to EM programs.
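For illustration only (not drawn from the study), the reported effect size (d=0.55) follows from the MES group means and standard deviations above. In the sketch below, the two group sizes are an assumption (an equal split), since the abstract reports only the pooled degrees of freedom (194).

import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    # Cohen's d using the pooled standard deviation of the two groups.
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# MES means/SDs as reported; the equal split of 196 residents is an assumption.
d = cohens_d(51.31, 10.13, 98, 43.59, 17.12, 98)
print(round(d, 2))  # ~0.55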

Read More

Emergency Medicine Residents Consistently Rate Themselves Higher than Attending Assessments on ACGME Milestones

Volume 16, Issue 6, November 2015.
Katja Goldflam, MD, et al.

Introduction: In 2012 the Accreditation Council for Graduate Medical Education (ACGME)
introduced the Next Accreditation System (NAS), which implemented milestones to assess the
competency of residents and fellows. While attending evaluation and feedback are crucial for resident
development, perhaps equally important is a resident's self-assessment. If a resident does not
accurately self-assess, clinical and professional progress may be compromised. The objective of our
study was to compare emergency medicine (EM) resident milestone evaluation by EM faculty with
the same resident’s self-assessment.
Methods: This is an observational, cross-sectional study that was performed at an academic,
four-year EM residency program. Twenty-five randomly chosen residents completed milestone
self-assessment using eight ACGME sub-competencies deemed by residency leadership as
representative of core EM principles. These residents were also evaluated by 20 faculty members.
The milestone levels were evaluated on a nine-point scale. We calculated the average difference
between resident self-ratings and faculty ratings, and used t-tests to determine the statistical
significance of the difference in scores.
Results: Eighteen residents evaluated themselves. Each resident was assessed by an average
of 16 attendings (min=10, max=20). Residents gave themselves significantly higher milestone
ratings than attendings did for each sub-competency examined (p<0.0001).
Conclusion: Residents over-estimated their abilities in every sub-competency assessed. This
underscores the importance of feedback and assessment transparency. More attention needs to be
paid to methods by which residency leadership can make residents’ self-perception of their clinical
ability more congruent with that of their teachers and evaluators. The major limitation of our study is
the small sample size of both residents and attendings.
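As a sketch only: the abstract does not specify the exact t-test variant used on the per-resident differences, but one straightforward approach is to average each resident's faculty ratings for a sub-competency and run a paired comparison against that resident's self-rating. All values below are hypothetical.

import numpy as np
from scipy import stats

# Hypothetical milestone ratings (nine-point scale) for one sub-competency.
self_ratings = np.array([6.0, 5.5, 7.0, 6.5, 5.0, 6.0])
faculty_means = np.array([4.5, 5.0, 6.0, 5.5, 4.0, 5.5])  # mean of each resident's faculty ratings

t_stat, p_value = stats.ttest_rel(self_ratings, faculty_means)  # paired t-test on the differences
print(f"mean difference = {(self_ratings - faculty_means).mean():.2f}, p = {p_value:.4f}")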

Read More

Integration of a Blog into an Emergency Medicine Residency Curriculum

Volume 16, Issue 6, November 2015.
Jay Khadpe, MD, et al.

Technologies and techniques for knowledge translation are rapidly evolving and there is a need for
graduate medical education (GME) curricula to keep up with these advances to reach our learners in
an effective manner. Technologies such as blogs, microblogs, wikis, podcasts, and vodcasts have the
potential to expand upon the current didactic models by adding dimensions and engaging learners in
modalities not previously available.

Read More

Ultrasound Training in the Emergency Medicine Clerkship

Volume 16, Issue 6, November 2015.
Mark Favot, MD, et al.

Introduction: The curriculum in most emergency medicine (EM) clerkships includes very little
formalized training in point-of-care ultrasound. Medical schools have begun to implement ultrasound
training in the pre-clinical curriculum, and the EM clerkship is an appropriate place to build upon this
training. The objectives were (1) to evaluate the effectiveness of implementing a focused ultrasound
curriculum within an established EM clerkship and (2) to obtain feedback from medical students
regarding the program.
Methods: We conducted a prospective cohort study of medical students during an EM clerkship
year from July 1, 2011, to June 30, 2012. Participants included fourth-year medical students
(n=45) enrolled in the EM clerkship at our institution. The students underwent a structured program
covering the focused assessment with sonography for trauma (FAST) exam and ultrasound-guided
vascular access. At the conclusion of the rotation, they took a 10-item multiple-choice test assessing
knowledge and image interpretation skills. A cohort of EM residents (n=20) also took the multiple
choice test but did not participate in the training with the students. We used an independent samples
t-test to examine differences in test scores between the groups.
Results: The medical students in the ultrasound training program scored significantly higher on
the multiple-choice test than the EM residents, t(63)=2.3, p<0.05. The feedback from the students
indicated that 82.8% were using ultrasound on their current rotations and the majority (55.2%) felt
that the one-on-one scanning shift was the most valuable aspect of the curriculum.
Discussion: Our study demonstrates support for an ultrasound training program for medical
students in the EM clerkship. After completing the training, students were able to perform at least as
well as EM residents on a knowledge-based exam.

Read More

Assessing EM Patient Safety and Quality Improvement Milestones Using a Novel Debate Format

Volume 16, Issue 6, November 2015.
Mira Mamtani, MD, et al.

Graduate medical education is increasingly focused on patient safety and quality improvement; training
programs must adapt their curricula to address these changes. We propose a novel curriculum for
emergency medicine (EM) residency training programs, called "EM Debates," that specifically addresses
patient safety, systems-based management, and practice-based performance improvement. Following
implementation of this educational curriculum, we performed a cross-sectional study to evaluate the
curriculum through resident self-assessment. Additionally, a cross-sectional study to determine the
ED clinical competency committee’s (CCC) ability to assess residents on specific competencies was
performed. Residents were overall very positive towards the implementation of the debates. Of those
participating in a debate, 71% felt that it improved their individual performance within a specific topic,
and 100% of those who led a debate felt that they could propose an evidence-based approach to
a specific topic. The CCC found it easier to assess milestones in patient safety, systems-based
management, and practice-based performance improvement (sub-competencies 16, 17, and 19)
than it had before the debates were implemented. The debates have been a helpful venue for
teaching EM residents about patient safety concepts, identification of medical errors, and process
improvement.

Read More

Implementation of an Education Value Unit (EVU) System to Recognize Faculty Contributions

Volume 16, Issue 6, November 2015.
Joseph House, MD, et al.

Introduction: Faculty educational contributions are hard to quantify, but in an era of limited
resources it is essential to link funding with effort. The purpose of this study was to determine the
feasibility of an educational value unit (EVU) system in an academic emergency department and
to examine its effect on faculty behavior, particularly on conference attendance and completion of
trainee evaluations.
Methods: A taskforce representing education, research, and clinical missions was convened
to develop a method of incentivizing productivity for an academic emergency medicine faculty.
Domains of educational contributions were defined and assigned a value based on time expended.
A 30-hour EVU threshold for achievement was aligned with departmental goals. Targets included
educational presentations, completion of trainee evaluations, and attendance at didactic conferences.
We compared faculty performance during the year preceding implementation with performance during
the year after implementation.
Results: Faculty (N=50) attended significantly more didactic conferences (22.7 hours v. 34.5
hours, p<0.005) and completed more trainee evaluations (5.9 v. 8.8 months, p<0.005). During
the pre-implementation year, 84% (42/50) met the 30-hour threshold with 94% (47/50) meeting
post-implementation (p=0.11). Mean total EVUs increased significantly (94.4 hours v. 109.8 hours,
p=0.04) resulting from increased conference attendance and evaluation completion without a change
in other categories.
Conclusion: In a busy academic department there are many work allocation pressures. An EVU
system integrated with an incentive structure to recognize faculty contributions increases the
importance of educational responsibilities. We propose an EVU model that could be implemented
by other academic departments and adjusted for differing departmental priorities.
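As an illustration (not part of the study), the abstract compares threshold attainment rates (84% vs. 94%, p=0.11) without naming the test used; a two-proportion z-test is one way such rates could be compared, sketched below with statsmodels.

from statsmodels.stats.proportion import proportions_ztest

# Faculty meeting the 30-hour EVU threshold before and after implementation (as reported).
counts = [42, 47]   # pre-implementation, post-implementation
nobs = [50, 50]     # faculty evaluated in each year

z_stat, p_value = proportions_ztest(counts, nobs)
print(f"z = {z_stat:.2f}, p = {p_value:.2f}")  # two-sided p, close to the reported 0.11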

Read More

Correlation of the National Board of Medical Examiners Emergency Medicine Advanced Clinical Examination Given in July to Intern American Board of Emergency Medicine in-training Examination Scores: A Predictor of Performance?

Volume 16, Issue 6, November 2015.
Katherine Hiller, MD, MPH

Introduction: There is great variation in the knowledge base of Emergency Medicine (EM) interns
in July. The first objective knowledge assessment during residency does not occur until eight months
later, in February, when the American Board of EM (ABEM) administers the in-training examination
(ITE). In 2013, the National Board of Medical Examiners (NBME) released the EM Advanced Clinical
Examination (EM-ACE), an assessment intended for fourth-year medical students. Administration of
the EM-ACE to interns at the start of residency may provide an earlier opportunity to assess the new
EM residents’ knowledge base. The primary objective of this study was to determine the correlation
of the NBME EM-ACE, given early in residency, with the EM ITE. Secondary objectives included
determination of the correlation of the United States Medical Licensing Examination (USMLE) Step 1
or 2 scores with early intern EM-ACE and ITE scores and the effect, if any, of clinical EM experience
on examination correlation.
Methods: This was a multi-institutional, observational study. Entering EM interns at six residencies
took the EM-ACE in July 2013 and the ABEM ITE in February 2014. We collected scores for the EM-ACE
and ITE, age, gender, weeks of clinical EM experience in residency prior to the ITE, and USMLE
Step 1 and 2 scores. Pearson’s correlation and linear regression were performed.
Results: Sixty-two interns took the EM-ACE and the ITE. The Pearson’s correlation coefficient
between the ITE and the EM-ACE was 0.62. R-squared was 0.5 (adjusted 0.4). The coefficient of
determination was 0.41 (95% CI [0.3-0.8]). For every increase of one in the scaled EM-ACE score,
we observed a 0.4% increase in the EM in-training score. In a linear regression model using all
available variables (EM-ACE, gender, age, clinical exposure to EM, and USMLE Step 1 and Step 2
scores), only the EM-ACE score was significantly associated with the ITE (p<0.05). We observed
significant collinearity among the EM-ACE, ITE, and USMLE scores. Gender, age, and number of
weeks of EM prior to the ITE had no effect on the relationship between EM-ACE and the ITE.
Conclusion: Given early in the intern year, the EM-ACE score showed a positive correlation with the ITE score.
Clinical EM experience prior to the in-training exam did not affect the correlation.
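For illustration only, the kind of correlation and regression reported above can be computed from paired scores with SciPy; the paired values below are hypothetical and do not come from the study.

import numpy as np
from scipy import stats

# Hypothetical paired scores for a few interns: scaled EM-ACE taken in July and
# ABEM ITE percent correct from the following February (illustrative values only).
em_ace = np.array([62, 68, 71, 75, 80, 84])
ite = np.array([66, 70, 71, 74, 77, 80])

r, p = stats.pearsonr(em_ace, ite)      # Pearson's correlation coefficient
fit = stats.linregress(em_ace, ite)     # simple linear regression of ITE on EM-ACE
print(f"r = {r:.2f}, slope = {fit.slope:.2f}, R^2 = {fit.rvalue ** 2:.2f}")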

Read More

Effect of a Novel Engagement Strategy Using Twitter on Test Performance

Volume 16, Issue 6, November 2015.
Amanda L. Webb, MS, et al.

Introduction: In recent years, medical educators have increasingly used social media to reach
technologically savvy learners. The utility of using Twitter for curriculum content
delivery has not been studied. We sought to determine if participation in a social media-based
educational supplement would improve student performance on a test of clinical images at the end
of the semester.
Methods: One hundred sixteen second-year medical students were enrolled in a lecture-based clinical medicine
course, in which images of common clinical exam findings were presented. An additional, optional
assessment was performed on Twitter. Each week, a clinical presentation and physical exam image
(not covered in course lectures) were distributed via Twitter, and students were invited to guess the
exam finding or diagnosis. After the completion of the course, students were asked to participate in a
slideshow “quiz” with 24 clinical images, half from lecture and half from Twitter.
Results: We conducted a one-way analysis of variance to determine the effect Twitter participation
had on total, Twitter-only, and lecture-only scores. Twitter participation data were collected from the
end-of-course survey, with participation categorized as submitting answers to the Twitter-only questions
"all or most of the time," "about half of the time," or "little or none of the time." We found a significant
difference in overall scores (p<0.001) and in Twitter-only scores (p<0.001). There was not enough
evidence to conclude a significant difference in lecture-only scores (p=0.124). Students who
submitted answers to Twitter “all or most of the time” or “about half the time” had significantly higher
overall scores and Twitter-only scores (p<0.001 and p<0.001, respectively) than those students who
only submitted answers “little or none of the time.”
Conclusion: While students retained less information from Twitter than from traditional classroom
lecture, some retention was noted. Future research on social media in medical education would
benefit from clear control and experimental groups in settings where quantitative use of social media
could be measured. Ultimately, it is unlikely that social media will replace lectures in the medical curriculum;
however, there is a reasonable role for social media as an adjunct to traditional medical education.
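As a sketch only, a one-way ANOVA across the three participation groups described above can be run with SciPy; the group scores below are hypothetical and are not the study's data.

from scipy import stats

# Hypothetical overall quiz scores (out of 24) grouped by self-reported Twitter participation.
all_or_most = [21, 22, 19, 23, 20]
about_half = [19, 20, 18, 21, 17]
little_or_none = [15, 17, 16, 18, 14]

f_stat, p_value = stats.f_oneway(all_or_most, about_half, little_or_none)  # one-way ANOVA
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")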

Read More

Contact Information

WestJEM/ Department of Emergency Medicine
UC Irvine Health

333 The City Blvd. West, Rt 128-01
Suite 640
Orange, CA 92868, USA
Phone: 1-714-456-6389
Email: editor@westjem.org


WestJEM
ISSN: 1936-900X
e-ISSN: 1936-9018

CPC-EM
ISSN: 2474-252X

Our Philosophy

Emergency Medicine is a specialty which closely reflects societal challenges and consequences of public policy decisions. The emergency department specifically deals with social injustice, health and economic disparities, violence, substance abuse, and disaster preparedness and response. This journal focuses on how emergency care affects the health of the community and population, and conversely, how these societal challenges affect the composition of the patient population who seek care in the emergency department. The development of better systems to provide emergency care, including technology solutions, is critical to enhancing population health.