Presented at The Gold Foundation Symposium, “How Are We Teaching Humanism in Medicine and What is Working?” September 27–29, 2007, Chicago, IL; and the 9th Annual International Meeting on Simulation in Healthcare (IMSH), January 10–14, 2009, Lake Buena Vista, FL.
Can Unannounced Standardized Patients Assess Professionalism and Communication Skills in the Emergency Department?
Article first published online: 10 AUG 2009
© 2009 by the Society for Academic Emergency Medicine
Academic Emergency Medicine
Volume 16, Issue 9, pages 915–918, September 2009
How to Cite
Zabar, S., Ark, T., Gillespie, C., Hsieh, A., Kalet, A., Kachur, E., Manko, J. and Regan, L. (2009), Can Unannounced Standardized Patients Assess Professionalism and Communication Skills in the Emergency Department? Academic Emergency Medicine, 16: 915–918. doi: 10.1111/j.1553-2712.2009.00510.x
Supported by Picker Institute Challenge Grant 2007.
- Issue published online: 1 SEP 2009
- Received February 27, 2009; revision received May 21, 2009; accepted May 22, 2009.
Keywords: standardized patients; graduate medical education
Objectives: The authors piloted unannounced standardized patients (USPs) in an emergency medicine (EM) residency to test feasibility, acceptability, and performance assessment of professionalism and communication skills.
Methods: Fifteen postgraduate year (PGY)-2 EM residents were scheduled to be visited by two USPs while working in the emergency department (ED). Multidisciplinary support was utilized to ensure successful USP introduction. Scores (% well done) were calculated for communication and professionalism skills using a 26-item, behaviorally anchored checklist. Residents’ attitudes toward USPs and USP detection were also surveyed.
Results: Of 27 USP encounters attempted, 17 (62%) were successfully completed. The detection rate was 44%. Eighty-three percent of residents who encountered a USP felt that the encounter did not hinder daily practice, and most reported that it did not make them uncomfortable (86%) or suspicious of patients (71%). Overall, residents received a mean score of 60% of communication items rated “well done” (SD ± 28%, range = 23%–100%) and 53% of professionalism items rated “well done” (SD ± 20%, range = 23%–85%). Residents’ communication skills were weakest for patient education and counseling (mean = 43%, SD ± 31%), compared with information gathering (mean = 68%, SD ± 36%) and relationship development (mean = 62%, SD ± 32%). Scores of residents who detected USPs did not differ from those of residents who did not.
Conclusions: Implementing USPs in the ED is feasible and acceptable to staff. The unpredictability of the ED, specifically resident schedules, accounted for most incomplete encounters. USPs may represent a new way to assess real-time resident physician performance without the need for faculty resources or the bias introduced by direct observation.
What options exist for assessing communication and professionalism skills? As residency programs seek to comply with the Accreditation Council for Graduate Medical Education (ACGME) Outcomes Project,1 robust modalities to evaluate clinical performance and educational effectiveness are in high demand. The ACGME’s Toolbox contains numerous tools for assessing communication skills,2 but many of these rely on self-assessment or on trained observers present during patient encounters. Patient complaints and postvisit surveys are useful sources of information but offer limited opportunity for physicians to translate feedback into practice change. Unannounced standardized patients (USPs)3–5 present a method of measuring physicians’ communication and professionalism skills in a real practice setting without the artificiality inherent in objective structured clinical examinations (OSCEs).6–8
We hypothesized that USPs can provide a real-time, accurate alternative to direct observation and OSCEs. The purpose of this project was to 1) describe the process of conducting a USP program in an emergency department (ED), 2) determine if implementing USPs in the ED is feasible, and 3) present preliminary results of a USP performance assessment.
This was a prospective, nonrandomized, cohort study to assess professionalism and communication abilities of emergency medicine (EM) residents using USPs. Informed consent was obtained from all participants. Research activities in this study were approved by the New York University School of Medicine Institutional Review Board through a resident registry wherein residents are asked to consent to allow inclusion of their educational and performance data in a research database. Data, therefore, are reported only for those residents for whom such consent was obtained.
Study Setting and Population
The Bellevue ED is a busy Level 1 trauma center at an academic medical center in New York City. The ED sees approximately 100,000 visits per year.
Fifteen EM residents in their second year of postgraduate training (PGY-2) participated in the EM Professionalism and Communication Training (EMPACT) Program. At the conclusion of the EMPACT training, residents were informed that they might be visited by USPs during their subsequent time working in the ED. However, residents were blinded as to the exact date of the visit or patient complaint.
Logistics. We required involvement from most ED staff areas including nurses, attending physicians, medical records (MRs), registration, informatics, and radiology. To ensure fidelity for each USP visit, we created a preexisting MR with a unique number, patient name and identifying information, prior visits, and test results. Each resident was scheduled to receive two USPs in urgent care (where residents’ schedules were relatively predictable) during the 4 to 6 weeks after the EMPACT curriculum.
USP Scenarios. We used two USP cases previously validated in OSCEs, representing common ED challenges and requiring only communication-based interventions. In the first case (a misread x-ray), residents needed to educate an angry patient recalled for a misread x-ray (skills: delivering bad news, dealing with a challenging patient, accountability), and in the second (a repeat visitor), care for a dissatisfied patient with chronic pain who repeatedly uses the ED (skills: handling emotion, patient education, accountability).
USP Training. Eight actors were recruited. On average, each received seven hours of training consisting of 1) discussion of character and situation, 2) calibration of emotional tone, 3) role play for standardization, 4) practice with attending and chief residents for realism, 5) review of “ground rules” for safety and nondetection, 6) review and practice with rating checklist, and 7) preparatory observational visit to the ED. Actors were compensated at a rate of $25/hour for both training and in-ED time.
USP Encounter. Unannounced standardized patients met project staff while residents attended a required conference. USPs were introduced to the triage nurse, the MR administrator, and the attending. The USPs were triaged per standard procedure.
During the encounter, USPs complied with any (noninvasive) exam and accepted all appointments and prescriptions, which were canceled postencounter. If the resident insisted on any course of action that made the USP feel unsafe, the USP was to ask for the attending, send a short message service (SMS) text message to project staff, or simply leave the ED. Hospital billing canceled the visit at the end of the day. Total time in the ED was 1.5 to 4 hours/visit. Immediately following the encounter, the USP debriefed and completed a behaviorally anchored checklist that assessed resident skills and the USP’s satisfaction with the visit.
Post-USP Survey. At the end of the project, all EM residents (including those who did not see a USP; n = 30) were surveyed about their attitudes toward USPs using a four-point scale (1 = strongly disagree, 4 = strongly agree) and open-ended questions. To determine detection rates, residents were asked if they had encountered a USP and if so to identify the USP’s sex and chief complaint.
Unannounced standardized patients assessed residents’ professionalism and communication skills, as well as their satisfaction with the patient-centeredness9,10 of the visit, using a three-point scale: “not done,” “partially done,” and “well done.” Scores were calculated as the percentage of items rated “well done” (Table 1). Professionalism and communication skills were each scored from 13 items and patient centeredness from eight items. Overall recommendation ratings were obtained using a four-point scale. Reliability estimates (Cronbach’s alpha) and descriptive statistics (means, standard deviations [SDs], and ranges) are reported. Correlations (Pearson’s r) between scores earned in the two separate cases are also reported to assess stability of performance.
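The scoring and reliability computations described above can be sketched as follows. This is an illustrative reconstruction, not the study’s analysis code: the item coding (0 = not done, 1 = partially done, 2 = well done) and the sample data are assumptions for demonstration only.

```python
def percent_well_done(ratings):
    """Percentage of checklist items rated 'well done' (coded 2)."""
    return 100.0 * sum(1 for r in ratings if r == 2) / len(ratings)

def cronbach_alpha(item_scores):
    """Internal consistency of a scale.

    item_scores: one inner list per checklist item, aligned across the
    same encounters (columns), e.g. item_scores[i][j] is item i's rating
    in encounter j.
    """
    k = len(item_scores)            # number of items
    n = len(item_scores[0])         # number of encounters

    def var(xs):                    # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in item_scores) / var(totals))

# One encounter's three "information gathering" ratings (hypothetical):
print(percent_well_done([2, 1, 2]))          # two of three items well done
# Two perfectly consistent items across three encounters give alpha = 1.0:
print(cronbach_alpha([[0, 1, 2], [0, 1, 2]]))
```

A domain score in Table 1 would then be the mean of `percent_well_done` across encounters, and the reliability column corresponds to `cronbach_alpha` over that domain’s items.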
Table 1. Checklist Domains, Items, and Resident Scores

| Domain of Assessment | Items | Mean, % | SD, % | Range, % | Reliability (α) |
|---|---|---|---|---|---|
| Information gathering | Used appropriate questions; clarified information; allowed to talk without interrupting | 68 | 36 | 0–100 | 0.82 |
| Relationship development | Communicated concern; nonverbal behavior enhanced communication; acknowledged emotions; was accepting/nonjudgmental; used words you understood | 62 | 32 | 20–100 | 0.85 |
| Education and counseling | Asked questions to see what you understood; provided clear explanations; collaborated with you in identifying next steps | 43 | 31 | 0–100 | 0.78 |
| Accountability | Disclosed error; personally apologized; took responsibility for situation | 49 | 23 | 0–80 | 0.60 |
| Managing difficult situations | Avoided assigning blame; maintained professionalism | 91 | 16 | 60–100 | 0.85 |
| Giving bad news | Prepared you to receive news; gave you opportunity to respond emotionally; provided appropriate next steps | 42 | 34 | 0–83 | 0.63 |
| Treatment plan and management | Assessed resources; arranged for follow-up; discussed plan | 50 | 39 | 0–100 | 0.66 |
| Patient centeredness* | Fully explored my experience; explored my expectations; came to an agreement; took a personal interest in me; earned (regained) my trust; acknowledged impact of error; didn’t make me feel I was wasting time; gave me enough information | 43 | 29 | 0–75 | 0.91 |
Seventeen of 27 visits were successfully conducted and evaluated. Resident scheduling problems explained most incomplete encounters. Five residents were visited by USPs from both cases, and seven residents from one case.
Seven of 12 residents who encountered a USP provided information on detection; four of their nine encounters were detected (44% detection rate). Five of 18 residents who did not see a USP indicated that they had (28% false-positive rate). One resident who reported a false detection acknowledged ignoring that patient.
The reliability of scores (Table 1) suggests adequate internal consistency (α > 0.60). Residents performed better in the misread x-ray case than in the repeat visitor case in professionalism (70% vs. 35%, t = 2.81, p = 0.048) and patient-centeredness (66% vs. 40%, t = 1.96, p = 0.05). Communication (r = 0.73, p = 0.16) and recommendation scores (r = 0.81, p = 0.09) were highly, albeit not significantly, correlated between the two cases, but professionalism (r = 0.24, p = 0.70) and patient-centeredness (r = 0.08, p = 0.90) were not, suggesting that case content matters most in these domains.
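The cross-case stability check above rests on Pearson’s r between the scores each resident earned on the two cases. A minimal sketch follows; the function is a standard textbook formula, and the sample scores are hypothetical, not the study’s data.

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation between paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical communication scores (% well done) for five residents who
# completed both cases; a high r would indicate stable performance:
case1 = [60, 75, 40, 90, 55]
case2 = [58, 80, 35, 85, 60]
print(round(pearson_r(case1, case2), 2))
```

In the study, only residents who completed both cases would contribute pairs, which is why these correlations rest on few observations and reach only marginal significance.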
Eighty-three percent of residents who encountered a USP felt that it did not hinder their daily practice and did not make them uncomfortable (86%) or suspicious of patients (71%). A minority of those residents who encountered a USP felt that the encounter improved their practice behavior (14%), made them think more (29%), or led them to be more self-aware (43%).
Our results show that developing and implementing a USP program in the ED is feasible and acceptable to residents. Considering the drawbacks of OSCE assessment and direct observation, combined with increasing demands on faculty time and decreasing funding, USPs may offer an objective, cost-effective method for evaluating skills in actual practice.
The biggest challenge faced while implementing the USP program was the unpredictability of the ED. Occasionally, USPs were mistakenly examined by another resident. Both content factors (highly trained SPs, realistic cases) and logistic factors (a dedicated program coordinator, electronic MRs, team collaboration) are necessary for successful integration. Total cost, in terms of both time and money, is likely greater up front, with workload, time, and expense decreasing as USPs and staff become trained. Further study of the costs is needed.
Even with a relatively high detection rate, residents reported value in the USP program for learning and patient care. It is possible that informing residents that USPs would be visiting them in the ED improved performance. More importantly, the majority of residents did not feel that the possibility of encountering a USP had any negative impact on their daily practice, suggesting that USPs in the ED will not put real patient safety at risk. The case of the resident who reported ignoring a patient thought to be “unannounced” represents an unanticipated and anomalous professionalism issue that we believe was not causally related to the use of USPs; it also demonstrates how USPs can provide useful information to program directors.
There was a small sample size, with a relatively large proportion of failed USP visits. However, the failure rate improved as the project progressed. Even with our small numbers, it appears that two cases and the items on the behaviorally anchored checklist can discriminate residents based on their communication skills.
With the ACGME placing greater importance on evaluation of patient outcomes, we believe that our project represents a new way to assess real-time resident performance. Despite being time-consuming and subject to the unpredictability of the ED, implementing unannounced standardized patients in the ED is feasible and acceptable to staff. Future comparison of unannounced standardized patient scores with OSCE scores will enable educators to determine how well these methods assess performance in actual practice.
- 1. Accreditation Council for Graduate Medical Education (ACGME). ACGME Outcome Project. Available at: http://www.acgme.org/outcome/comp/compCPRL.asp. Accessed Sep 20, 2008.
- 2. Accreditation Council for Graduate Medical Education, American Board of Medical Specialties. Outcome Project Toolbox of Assessment Methods. Available at: http://www.acgme.org/outcome/assess/toolbox.asp. Accessed Jun 20, 2009.
- 9. The Patient Satisfaction Questionnaire Short Form (PSQ-18). RAND Corporation, Paper P-7865. Available at: http://www.rand.org/pubs/papers/P7865/. Accessed Jun 20, 2009.