Nursing staff assessment of residents’ professionalism and communication skills
Gary Sutkin, University of Pittsburgh, Magee-Womens Hospital, 300 Halket Street, Room 2326, Pittsburgh, Pennsylvania 15213, USA. Tel: 00 1 412 641 1440; Fax: 00 1 412 641 1133; E-mail: firstname.lastname@example.org
Context and setting
Notably positive or negative behaviours related to professionalism and interpersonal and communication skills (ICS) occur relatively infrequently during resident training. Meaningful in-training assessment can come from observation of resident behaviour in the actual work environment. Despite calls for 360-degree evaluation of residents by the Accreditation Council for Graduate Medical Education (ACGME), few programmes routinely use resident performance assessment (RPA) by nursing staff.
Why the idea was necessary
We believe nursing staff observe numerous resident–patient and resident–staff interactions and can identify residents who are performing seriously below or above standards. Our objective was to involve nurses in a collaborative effort to create and pilot-test a professionalism and ICS rating form.
What was done
A list of 17 global rating items was distributed to 185 registered nurses, licensed vocational nurses and nursing technical staff in four clinical sites (operating room, outpatient clinic, surgical and postpartum floor, and labour and delivery area). Forty-seven nursing staff (25%) returned the list, which was condensed to a 10-item RPA form with questions such as: ‘Does the resident listen to and consider what you have to say?’ and ‘Is the resident courteous to patients and their families?’ The final 10-item RPA form was distributed to all 185 nursing staff so they could evaluate all 12 obstetrics and gynaecology residents on two separate occasions (test–retest), 2 weeks apart.
Evaluation of results and impact
Individual residents were evaluated by a mean of 36 nurses on each occasion. Although return rates were only 30% and 25%, many nurses told us they appreciated being part of the assessment process. Pearson correlations measuring test–retest reliability over the 2-week interval were high (0.65–0.85) for nine of the 10 items, indicating stable ratings. The generalisability of individual resident mean scores across 28 nurses was moderate (r = 0.39).
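For readers who wish to replicate this kind of reliability check, the test–retest analysis can be sketched as a per-item Pearson correlation between the two rating occasions. The scores below are illustrative values, not the study's data:

```python
# Hypothetical sketch of the test-retest reliability analysis:
# correlate the mean rating each resident received at time 1 with
# the mean rating at time 2, for a single RPA item.
from scipy.stats import pearsonr

# Illustrative mean scores for 12 residents on one item (not real data).
time1 = [4.2, 3.1, 2.0, 4.5, 3.8, 4.0, 3.5, 4.1, 3.9, 4.4, 3.6, 4.7]
time2 = [4.0, 3.3, 2.2, 4.4, 3.9, 4.1, 3.4, 4.0, 4.1, 4.3, 3.7, 4.6]

r, p = pearsonr(time1, time2)
print(f"test-retest r = {r:.2f} (p = {p:.3f})")
```

Repeating this for each of the 10 items would yield the per-item reliability range reported above.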
One-way ANOVA and Tukey's HSD (honestly significant difference) testing performed on scores for 10 items from 55 forms identified Resident 3 as a low-score outlier on nine of 10 items and Resident 12 as a high-score outlier on eight of 10 items. The status of these two outliers was not surprising to the clinical faculty.
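The outlier analysis can likewise be sketched as a one-way ANOVA across residents followed by Tukey's HSD for pairwise comparisons. The simulated scores and resident labels below are hypothetical, chosen only to mimic one low-scoring resident against two typical peers:

```python
# Hypothetical sketch of the outlier analysis: one-way ANOVA across
# residents, then Tukey's HSD pairwise comparisons. Data are simulated.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
# Simulated item scores from 15 nurses for three residents;
# "res_low" mimics a low-score outlier such as Resident 3.
res_low = rng.normal(2.0, 0.5, 15)
res_mid = rng.normal(4.0, 0.5, 15)
res_high = rng.normal(4.6, 0.5, 15)

f, p = f_oneway(res_low, res_mid, res_high)
print(f"ANOVA: F = {f:.1f}, p = {p:.4f}")

scores = np.concatenate([res_low, res_mid, res_high])
groups = ["R3"] * 15 + ["R7"] * 15 + ["R12"] * 15
print(pairwise_tukeyhsd(scores, groups))
```

A significant omnibus F followed by HSD comparisons in which one resident differs from all others is the pattern that flags a low- or high-score outlier.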
Disturbingly, when asked ‘Do you believe that the nursing staff are appropriate evaluators of resident professionalism and ICS?’, 67% of residents said they did not.
We received numerous written comments from the nurses, which proved to be informative for both the residents and ourselves. Examples included: ‘He is excellent, caring, honest, respectful’; ‘At times he treats us disrespectful[ly]. Does not listen even when [the nurse is] proven right’; ‘I find her sometimes to be brash with patients’; and ‘He sometimes refers to the patients as “sweetie” or “honey”.’ Narrative comments can serve as an aid in the counselling of residents who might require corrective action or positive recognition.
We were pleased that the nursing staff were willing to participate in the resident assessment process. It is vital that problems with professionalism and ICS are documented and addressed during training, and we believe that nursing staff can provide useful assessments in a variety of residency training settings.