Archives

Preparing Emergency Medicine Residents to Disclose Medical Error Using Standardized Patients

Carmen N. Spalding, PhD, et al.

Emergency Medicine (EM) is a unique clinical learning environment. The Accreditation Council for Graduate Medical Education (ACGME) Clinical Learning Environment Review Pathways to Excellence calls for “hands-on training” in disclosure of medical error (DME) during residency. Training and practicing key elements of DME using standardized patients (SP) may enhance EM residents’ preparedness to perform this crucial skill in a clinical setting.

Read More

Pilot Point-of-Care Ultrasound Curriculum at Harvard Medical School: Early Experience

Joshua S. Rempell, MD, MPH, et al.

Point-of-care ultrasound (POCUS) is expanding across all medical specialties. As the benefits of US technology are becoming apparent, efforts to integrate US into pre-clinical medical education are growing. Our objective was to describe our process of integrating POCUS as an educational tool into the medical school curriculum and how such efforts are perceived by students.

Read More

Derivation of Two Critical Appraisal Scores for Trainees to Evaluate Online Educational Resources: A METRIQ Study

Teresa M. Chan, MD, MHPE, et al.

Online education resources (OERs), like blogs and podcasts, increasingly augment or replace traditional medical education resources such as textbooks and lectures. Trainees’ ability to evaluate these resources is poor, and few quality assessment aids have been developed to assist them. This study aimed to derive a quality evaluation instrument for this purpose.

Read More

Introducing a Fresh Cadaver Model for Ultrasound-guided Central Venous Access Training in Undergraduate Medical Education

Volume 17, Issue 3, May 2016
Ryan Miller, BS et al.

Introduction: Over the past decade, medical students have witnessed a decline in opportunities to perform technical skills during their clinical years. Ultrasound-guided central venous access (USG-CVA) is a critical procedure commonly performed by emergency medicine, anesthesia, and general surgery residents, often during their first month of residency. However, the skills required to safely perform this procedure are often deficient upon graduation from medical school. To ameliorate this lack of technical proficiency, ultrasound simulation models have been introduced into undergraduate medical education to train venous access skills. Criticisms of simulation models include their innate lack of realistic tactile qualities and their lack of anatomical variation compared with living patients. The purpose of our investigation was to design and evaluate a lifelike, reproducible training model for USG-CVA using a fresh cadaver.
Methods: This was a cross-sectional study at an urban academic medical center. An 18-point procedural knowledge tool and an 18-point procedural skill evaluation tool were administered during a cadaver lab at the beginning and end of the surgical clerkship. During the fresh cadaver lab, procedure-naïve third-year medical students were trained to perform ultrasound-guided central venous access of the femoral and internal jugular vessels. Preparation of the fresh cadaver model involved placement of thin-walled latex tubing in the anatomic locations of the femoral and internal jugular veins, respectively.
Results: Fifty-six third-year medical students participated in this study during their surgical clerkship. The fresh cadaver model provided high-quality, lifelike ultrasound images despite numerous cannulation attempts. Technical skill scores improved from an average of 3 to 12 (p<0.001), and procedural knowledge scores improved from an average of 4 to 8 (p<0.001).

Conclusion: The use of this novel cadaver model prevented extravasation of fluid, maintained ultrasound-imaging quality, and proved to be an effective educational model allowing third-year medical students to improve and maintain their technical skills.

Read More

Coordinating a Team Response to Behavioral Emergencies in the Emergency Department: A Simulation-Enhanced Interprofessional Curriculum

Volume 16, Issue 6, November 2015.
Ambrose H. Wong, MD, et al.

Read More

Development of an Objective Structured Clinical Examination for Assessment of Clinical Skills in an Emergency Medicine Clerkship

Volume 16, Issue 6, November 2015.
Sharon Bord, MD, et al.

Assessment of medical students in their emergency medicine (EM) clerkship is often based on clinical shift evaluations and written examinations. Clinical evaluations offer some insight into students’ ability to apply knowledge to clinical problems, but are notoriously unreliable, with score variance that may be driven as much by error as by actual student performance.

Read More

Direct Observation Assessment of Milestones: Problems with Reliability

Volume 16, Issue 6, November 2015.
Meghan Schott, MD, et al.

Introduction: Emergency medicine (EM) milestones are used to assess residents’ progress. While some milestone validity evidence exists, there is a lack of standardized tools available to reliably assess residents. Inherent to this is a concern that we may not be truly measuring what we intend to assess. The purpose of this study was to design a direct observation milestone assessment instrument supported by validity and reliability evidence. In addition, such a tool would lend further validity evidence to the EM milestones by demonstrating their accurate measurement.

Methods: This was a multi-center, prospective, observational validity study conducted at eight institutions. The Critical Care Direct Observation Tool (CDOT) was created to assess EM residents during resuscitations. The tool was designed using a modified Delphi method focused on content, response process, and internal structure validity. Paying special attention to content validity, the CDOT was developed by an expert panel that maintained the EM milestone wording. We built response process and internal consistency by piloting and revising the instrument. Raters were faculty who routinely assess residents on the milestones; all completed a brief training video on use of the instrument. Raters used the CDOT to assess simulated videos of three residents at different stages of training in a critical care scenario. We measured reliability using Fleiss’ kappa and intraclass correlations.

Results: Two versions of the CDOT were used: one used the milestone levels as global rating scales with anchors, and the second reflected a current trend toward a checklist response system. Although the raters who used the CDOT routinely rate residents in their practice, they did not score the residents’ performances in the videos comparably, which led to poor reliability. The Fleiss’ kappa for each of the items measured on both versions of the CDOT was near zero.

Conclusion: The validity and reliability of the current EM milestone assessment tools have yet to be determined. This study is a rigorous attempt to collect validity evidence in the development of a direct observation assessment instrument. However, despite strict attention to validity evidence, inter-rater reliability was low. The potential sources of reducible variance include rater- and instrument-based error. Based on this study, there may be concerns about the reliability of other EM milestone assessment tools currently in use.

Read More

Ultrasound Training in the Emergency Medicine Clerkship

Volume 16, Issue 6, November 2015.
Mark Favot, MD, et al.

Introduction: The curriculum in most emergency medicine (EM) clerkships includes very little formalized training in point-of-care ultrasound. Medical schools have begun to implement ultrasound training in the pre-clinical curriculum, and the EM clerkship is an appropriate place to build upon this training. The objectives were (1) to evaluate the effectiveness of implementing a focused ultrasound curriculum within an established EM clerkship and (2) to obtain feedback from medical students regarding the program.

Methods: We conducted a prospective cohort study of medical students during an EM clerkship year from July 1, 2011, to June 30, 2012. Participants included fourth-year medical students (n=45) enrolled in the EM clerkship at our institution. The students underwent a structured program focused on the Focused Assessment with Sonography for Trauma (FAST) exam and ultrasound-guided vascular access. At the conclusion of the rotation, they took a 10-item multiple-choice test assessing knowledge and image interpretation skills. A cohort of EM residents (n=20) also took the multiple-choice test but did not participate in the training with the students. We used an independent-samples t-test to examine differences in test scores between the groups.

Results: The medical students in the ultrasound training program scored significantly higher on the multiple-choice test than the EM residents, t(63)=2.3, p<0.05. Student feedback indicated that 82.8% were using ultrasound on their current rotations, and the majority (55.2%) felt that the one-on-one scanning shift was the most valuable aspect of the curriculum.

Discussion: Our study demonstrates support for an ultrasound training program for medical students in the EM clerkship. After completing the training, students were able to perform similarly to EM residents on a knowledge-based exam.

Read More

Assessing EM Patient Safety and Quality Improvement Milestones Using a Novel Debate Format

Volume 16, Issue 6, November 2015.
Mira Mamtani, MD, et al.

Graduate medical education is increasingly focused on patient safety and quality improvement; training programs must adapt their curricula to address these changes. We propose a novel curriculum for emergency medicine (EM) residency training programs specifically addressing patient safety, systems-based management, and practice-based performance improvement, called “EM Debates.” Following implementation of this educational curriculum, we performed a cross-sectional study to evaluate the curriculum through resident self-assessment. Additionally, we performed a cross-sectional study to determine the ED clinical competency committee’s (CCC) ability to assess residents on specific competencies. Residents were overall very positive towards the implementation of the debates. Of those participating in a debate, 71% felt that it improved their individual performance within a specific topic, and 100% of those who led a debate felt that they could propose an evidence-based approach to a specific topic. The CCC found it easier to assess milestones in patient safety, systems-based management, and practice-based performance improvement (sub-competencies 16, 17, and 19) than before the implementation of the debates. The debates have been a helpful venue for teaching EM residents about patient safety concepts, identifying medical errors, and process improvement.

Read More

Implementation of an Education Value Unit (EVU) System to Recognize Faculty Contributions

Volume 16, Issue 6, November 2015.
Joseph House, MD, et al.

Introduction: Faculty educational contributions are hard to quantify, but in an era of limited resources it is essential to link funding with effort. The purpose of this study was to determine the feasibility of an educational value unit (EVU) system in an academic emergency department and to examine its effect on faculty behavior, particularly conference attendance and completion of trainee evaluations.

Methods: A taskforce representing the education, research, and clinical missions was convened to develop a method of incentivizing productivity for an academic emergency medicine faculty. Domains of educational contribution were defined and assigned a value based on time expended. A 30-hour EVU threshold for achievement was aligned with departmental goals. Targets included educational presentations, completion of trainee evaluations, and attendance at didactic conferences. We compared performance during the years preceding and following implementation.

Results: Faculty (N=50) attended significantly more didactic conferences (22.7 v. 34.5 hours, p<0.005) and completed more trainee evaluations (5.9 v. 8.8 months, p<0.005). During the pre-implementation year, 84% (42/50) met the 30-hour threshold, with 94% (47/50) meeting it post-implementation (p=0.11). Mean total EVUs increased significantly (94.4 v. 109.8 hours, p=0.04), resulting from increased conference attendance and evaluation completion without a change in other categories.

Conclusion: In a busy academic department there are many work allocation pressures. An EVU system integrated with an incentive structure to recognize faculty contributions increases the importance of educational responsibilities. We propose an EVU model that could be implemented and adjusted for differing priorities at other academic departments.

Read More

Correlation of the National Board of Medical Examiners Emergency Medicine Advanced Clinical Examination Given in July to Intern American Board of Emergency Medicine in-training Examination Scores: A Predictor of Performance?

Volume 16, Issue 6, November 2015.
Katherine Hiller, MD, MPH

Introduction: There is great variation in the knowledge base of emergency medicine (EM) interns in July. The first objective knowledge assessment during residency does not occur until eight months later, in February, when the American Board of EM (ABEM) administers the in-training examination (ITE). In 2013, the National Board of Medical Examiners (NBME) released the EM Advanced Clinical Examination (EM-ACE), an assessment intended for fourth-year medical students. Administering the EM-ACE to interns at the start of residency may provide an earlier opportunity to assess new EM residents’ knowledge base. The primary objective of this study was to determine the correlation of the NBME EM-ACE, given early in residency, with the EM ITE. Secondary objectives included determining the correlation of United States Medical Licensing Examination (USMLE) Step 1 or 2 scores with early intern EM-ACE and ITE scores, and the effect, if any, of clinical EM experience on examination correlation.

Methods: This was a multi-institutional, observational study. Entering EM interns at six residencies took the EM-ACE in July 2013 and the ABEM ITE in February 2014. We collected scores for the EM-ACE and ITE, age, gender, weeks of clinical EM experience in residency prior to the ITE, and USMLE Step 1 and 2 scores. Pearson’s correlation and linear regression were performed.

Results: Sixty-two interns took the EM-ACE and the ITE. The Pearson’s correlation coefficient between the ITE and the EM-ACE was 0.62. R-squared was 0.5 (adjusted 0.4). The coefficient of determination was 0.41 (95% CI [0.3-0.8]). For every increase of one in the scaled EM-ACE score, we observed a 0.4% increase in the EM in-training score. In a linear regression model using all available variables (EM-ACE, gender, age, clinical exposure to EM, and USMLE Step 1 and Step 2 scores), only the EM-ACE score was significantly associated with the ITE (p<0.05). We observed significant collinearity among the EM-ACE, ITE, and USMLE scores. Gender, age, and number of weeks of EM prior to the ITE had no effect on the relationship between the EM-ACE and the ITE.

Conclusion: Given early during intern year, the EM-ACE score showed positive correlation with the ITE. Clinical EM experience prior to the in-training exam did not affect the correlation.

Read More

Contact Information

WestJEM/ Department of Emergency Medicine
UC Irvine Health

3800 W Chapman Ave Ste 3200
Orange, CA 92868, USA
Phone: 1-714-456-6389
Email: editor@westjem.org


WestJEM
ISSN: 1936-900X
e-ISSN: 1936-9018

CPC-EM
ISSN: 2474-252X

Our Philosophy

Emergency Medicine is a specialty which closely reflects societal challenges and consequences of public policy decisions. The emergency department specifically deals with social injustice, health and economic disparities, violence, substance abuse, and disaster preparedness and response. This journal focuses on how emergency care affects the health of the community and population, and conversely, how these societal challenges affect the composition of the patient population who seek care in the emergency department. The development of better systems to provide emergency care, including technology solutions, is critical to enhancing population health.