Emergency Medicine Residency Applicant Views on the Interview Day Process

Authors

  • Nicole M. DeIorio MD,

    1. From the Department of Emergency Medicine, Oregon Health and Science University (NMD, LMY), Portland, OR; and the Department of Emergency Medicine, Brown University (SAG), Providence, RI.
  • Lalena M. Yarris MD, MCR,

  • Sarah A. Gaines MD


  • CoI: The authors report that they have no financial conflicts of interest.

Address for correspondence and reprints: Nicole M. DeIorio, MD; e-mail: deiorion@ohsu.edu.

Abstract

Objectives:  Emergency medicine (EM) residency programs spend significant time and money offering an interview day experience for their applicants. The day may include a range of activities, but it is not known which are most important from the applicants’ point of view.

Methods:  An anonymous web-based survey was sent to all applicants to an EM residency program from the 2006/07 cycle. The study assessed factors about the interview day that were most helpful to applicants in assessing goodness of fit and preparing their rank list of programs.

Results:  When considering whether a program was a good fit for them, the respondents chose, from most to least important: how happy the residents seem; faculty–resident relationships; how well the residents work together; resident and faculty values match my own; the residents spend time together outside of the residency; and the residents share my outside interests.

Applicants most value assessing program “personality,” informal off-campus gatherings with residents, and interviewing with the program director as ways to decide where a program will reside on their rank list. Touring off-campus emergency departments and off-service facilities received the lowest rating averages.

Conclusions:  Residency programs have the opportunity to control two of the three most important ways in which applicants use the interview day to assess programs by offering off-campus gatherings with residents and ensuring that every candidate interviews with the program director. Residency programs may use this knowledge to optimize interview day resources.

Introduction

Emergency medicine (EM) has become an increasingly popular career path for graduating medical students.1 Residency programs select future cohorts of trainees from a competitive pool of highly qualified applicants.1 Thus, they typically expend considerable time and money recruiting residents.

Residency program selection committees have found little guidance in the literature regarding the factors that are important to applicants in selecting an EM residency program. A recent study by DeSantis and Marco concluded that the top five factors that applicants consider when selecting a residency program are friendliness, environment, interview experience, academics, and location.2

We recently conducted a study assessing the factors most important to applicants in selecting a residency program overall. The five factors reported to have the greatest importance were how happy the residents seemed, program personality, faculty enthusiasm, geographic location, and experience during the interview.3 The interview day itself influences four of these top five factors, although these studies did not explore the specifics of the interview day. Therefore, the specific aim of this study was to determine which components of the interview day are most important to applicants to an EM residency.

Methods

Study Design and Population

This was a cross-sectional study of EM residency applicants’ perceptions of the interview process. The survey was administered to all applicants who applied to the Oregon Health and Science University EM residency during the 2006/07 residency selection season. This is a 3-year academic program on the West Coast, with applicants from across the country and several international medical schools. Our Institutional Review Board granted a waiver of formal written consent.

Survey Content and Administration

The survey included questions to assess the relative importance of several aspects of the interview day that applicants may consider in selecting an EM residency, as well as some other general information about the match process.

Several types of questions were used in the instrument. Applicants were asked to rank, from 1 to 7, the aspects of a program they used in considering its goodness of fit. The options were: how happy the residents seem; how well the residents work together; faculty–resident relationships; program emphasis matched my career goals; resident and faculty values match my own; the residents spend time together outside of the residency; and the residents share my outside interests. To assess how applicants formulated their rank lists, they were asked to rank the five most important aspects of the interview day from 13 choices (Table 1). Both of these questions offered the opportunity to type in free-text comments regarding other items they viewed as important in their assessment of programs.

Table 1. 
Aspects of the Interview Day Deemed Most Important in Formulating the Rank List
Considering the following aspects of the interview day, please rank the FIVE aspects most important to you in formulating your rank list, in order of importance.
Values are % (n).

| Aspect | Most important | 2nd most important | 3rd most important | 4th most important | 5th most important | Rating average | Response count |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Interview with program director | 6.2 (12) | 35.2 (68) | 30.6 (59) | 17.1 (33) | 10.9 (21) | 2.91 | 193 |
| Interviews with faculty members | 1.9 (3) | 12.9 (20) | 25.8 (40) | 38.1 (59) | 21.3 (33) | 3.64 | 155 |
| Interviews with residents | 5.3 (5) | 21.1 (20) | 29.5 (28) | 23.2 (22) | 21.1 (20) | 3.34 | 95 |
| Informal off-campus gathering with residents | 7.8 (12) | 41.8 (64) | 20.9 (32) | 15.0 (23) | 14.4 (22) | 2.86 | 153 |
| Informal off-campus gathering with faculty | 0.0 (0) | 0.0 (0) | 0.0 (0) | 66.7 (2) | 33.3 (1) | 4.33 | 3 |
| Meeting residents’ families during informal off-campus gathering | 7.1 (1) | 0.0 (0) | 7.1 (1) | 21.4 (3) | 64.3 (9) | 4.36 | 14 |
| Informal gathering with residents on campus as part of interview day schedule | 0.0 (0) | 22.8 (21) | 37.0 (34) | 21.7 (20) | 18.5 (17) | 3.36 | 92 |
| Tour of the primary ED | 0.8 (1) | 13.1 (16) | 7.4 (9) | 34.4 (42) | 44.3 (54) | 4.08 | 122 |
| Tour of off-campus EDs | 0.0 (0) | 0.0 (0) | 0.0 (0) | 0.0 (0) | 100.0 (3) | 5.00 | 3 |
| Tour of off-service facilities (e.g., medicine wards, ICU) | 0.0 (0) | 0.0 (0) | 0.0 (0) | 0.0 (0) | 100.0 (5) | 5.00 | 5 |
| Informational slide show | 1.1 (1) | 14.6 (13) | 15.7 (14) | 25.8 (23) | 42.7 (38) | 3.94 | 89 |
| Meeting the program coordinator | 0.0 (0) | 25.0 (5) | 45.0 (9) | 15.0 (3) | 15.0 (3) | 3.20 | 20 |
| Overall “feel” or “personality” of the program | 88.8 (198) | 3.1 (7) | 3.6 (8) | 1.8 (4) | 2.7 (6) | 1.26 | 223 |

Questions with 4-point Likert scales asked respondents how much they agreed with the statements “Everyone should interview with the program director,” “The purpose of the interview day should be for the program to sell itself to me,” “…for me to sell myself to the program,” “…for the program to evaluate me as a candidate,” and “…for me to evaluate the program.”

Multiple-choice questions assessed how many programs were applied to, interviewed at, and ranked, as well as participation in the couples match and whether programs other than EM were ranked.

The survey was administered by e-mail using a web-based survey program (http://www.SurveyMonkey.com) using the e-mail address each applicant had provided through the Electronic Residency Application Service (ERAS) and was sent out on the day after the candidates’ deadline for entry of rank lists. Reminder e-mails were sent to all respondents 1 and 2 weeks after the initial survey. No incentives for participation were offered. An information sheet that explained the research intent of the survey and explicitly stated that participation was voluntary accompanied the survey. Responses did not contain any identifying information. The survey was not administered until after the selection committee had formulated and submitted its own list to ERAS ranking the candidates.

Data Analysis

The response average for the goodness-of-fit question was calculated and is reported from the lowest to highest average as most to least important.

With regard to the interview day activity question, a rating average was calculated for each question, and the items with lower numbers were reported as more important.
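The rating average described above appears to be an unweighted mean of the assigned ranks among the respondents who selected a given item; a minimal sketch of that calculation (the function name and structure are illustrative, not from the study) reproduces the Table 1 values:

```python
def rating_average(counts_by_rank):
    """Mean rank among respondents who selected the item.

    counts_by_rank: list where index i holds the number of respondents
    who ranked the item (i + 1)th most important.
    """
    total = sum(counts_by_rank)
    weighted = sum((rank + 1) * n for rank, n in enumerate(counts_by_rank))
    return weighted / total

# Table 1, "Interview with program director": rank 1-5 counts.
avg = rating_average([12, 68, 59, 33, 21])
print(round(avg, 2))  # 2.91, matching Table 1
```

Because this mean is taken only over the respondents who chose the item, items selected by few respondents can still receive extreme averages, a point revisited in the Limitations.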

The 4-point Likert-scale responses to the items addressing interviewing with the program director and the ultimate purpose of the interview day were collapsed into agree and disagree. Descriptive statistics are reported using the Survey Monkey (Menlo Park, CA) software.

Results

Of our 706 applicants, 241 responded to the survey (34% response rate). When asked to consider how they determined whether a program was a good fit for them, the respondents chose, in order from most to least important: how happy the residents seem; faculty–resident relationships; how well the residents work together; resident and faculty values match my own; the residents spend time together outside of the residency; and the residents share my outside interests.

Applicants were also asked the five aspects of the interview day most important to them in formulating their rank list. The top five responses, in order of rating average on a scale of 1 to 5, were overall “feel” or “personality” of the program (average 1.26), informal off-campus gathering with the residents (2.86), interview with the program director (2.91), meeting the program coordinator (3.20), and interviews with residents (3.34). Touring off-campus emergency departments and off-service facilities received the lowest rating averages (5.00 for both). A list of all the choices and their averages is shown in Table 1.

When respondents were asked to write in other factors that are important in assessing goodness of fit or in ranking a program, a number added that the level of organization of the day was an important surrogate marker of the program (n = 20/86, 23%).

Ninety-six percent of respondents (226/235) agreed strongly or somewhat with the statement that everyone should interview with the program director.

In assessing what applicants believe to be the real purpose of the interview day, we asked them to evaluate certain statements. The statements that achieved the most agreement (highest percentages of respondents selecting agree strongly or agree somewhat), from most to least, were for me to evaluate the program (99.6%, 234/235), for the program to evaluate me as a candidate (96.1%, 221/235), for the program to sell itself to me (90.2%, 212/235), and for me to sell myself to the program (81.3%, 191/235). Tables 2 and 3 report other aspects of respondents’ match experiences.

Table 2. 
Number of Interview Experiences
Values are %.

| | 0–5 | 6–10 | 11–15 | 16–20 | >20 |
| --- | --- | --- | --- | --- | --- |
| EM programs applied to | 1.7 | 4.8 | 9.6 | 20.0 | 63.9 |
| EM places interviewed | 10.4 | 39.1 | 40.0 | 9.1 | 1.3 |
| EM programs ranked | 14.4 | 46.5 | 31.3 | 7.8 | 0.0 |
Table 3. 
Match Demographics
Values are %.

| | Yes | No |
| --- | --- | --- |
| Couples match? | 7.0 | 93.0 |
| Ranked non-EM programs? | 5.2 | 94.8 |

Discussion

Although applicants to our EM program were least likely to agree with the statement that the purpose of the interview day was for the program to sell itself to them, programs nonetheless devote significant resources to this activity during recruitment season.

In our study, EM applicants placed a high priority on the “personality” of the program when deciding how to order their rank lists, although this is nebulous and difficult for programs to control when trying to showcase themselves. However, program directors should include other elements shown to be important, as well, such as a mandatory interview with the program director, meeting the program coordinator, and interviewing with current residents. This is consistent with a survey of former neurology applicants, in which Adair et al. discovered that interviews with faculty, interaction with current residents, and interviews with residents, in that order, made the most enduring impressions.4 Also, although the areas of the program we chose to ask about with regard to goodness of fit were all intangibles, program directors could highlight these aspects of their own program by showing candid photographs in the informational slide show and by discussing examples of current residents who may share an applicant’s interests and background during individual interviews.

Although we did not collect data regarding respondents’ strengths as candidates, Table 2 can provide applicants and their advisors with some guidance regarding numbers of programs to apply to, interview at, and rank by considering these national averages.

Limitations

For the question assessing which interview day activities applicants found most valuable, we chose to report the calculated rating average. Because this average is not weighted by how many respondents chose each option, an activity could be over-represented in our results if a small number of respondents chose it but ranked it highly. Because this was only one way to interpret the data, we report the entire data set, including the response counts for each item, so that readers can draw their own conclusions (Table 1).

Our findings may not be generalizable to applicants who did not apply to our program or who did not respond to our survey. Given that 66% did not respond, response bias could have occurred, although in the 2006/07 cycle, we received applications from 706 of the 1,669 applicants to EM programs as reported by the National Resident Matching Program;5 therefore, our program attracts a large percentage of applicants. It is difficult to postulate how responders might be different from non-responders because we did not collect data on the non-responders, although one could posit that non-responders may be somehow less invested in improving the interview process, perhaps because of fatigue from multiple interviews or disenfranchisement from unmet expectations with the process.

We attempted to survey all applicants to our program, whether or not they were offered an interview. An unexpected finding was the number of write-in comments expressing dismay that we would survey applicants to whom we had not offered an interview. This probably accounts for some of the discrepancy between the number of respondents and the number of meaningful surveys returned.

It may have been valuable to analyze responses and response rates based on whether the respondent had been offered an interview at our program, but because a main concern in study design was to assure applicants of maximal confidentiality, we did not attempt to link responses with particular categories of applicants. Because applicants are at such a vulnerable point in the interview process, we did not want to give the appearance of collecting any data that might make them fear repercussions to their position on our program’s rank list.

We elected to survey a broad range of applicants (those who did not receive an interview invitation from us, those who were offered and declined to interview with us, and those who interviewed with us) to gather opinions from a diverse group consisting of hundreds of people and improve generalizability. However, this breadth of subjects also means that the responses collected could have been from applicants who were very different and had had a wide variety of experiences.

A future study of a similar type could ask respondents to self-report whether they had or had not been offered an interview at the study site and how many interviews they had been on, or a multi-site study could be developed wherein only applicants who had interviewed at the study site(s) were surveyed, allowing for collection of powerful data from a more homogeneous group.

Conclusions

Certain aspects of the interview day, namely interviewing with the program director, meeting current residents in an interview setting or informally off-campus, and meeting the program coordinator, were high yield for applicants in formulating their rank lists. Residency programs may not want to expend the time and effort required to transport applicants to off-campus training sites, because this was shown to be less valuable to candidates. Programs can use this information to plan where to put their time and financial resources when considering how to give applicants the most important information and exposure to their residency.
