Keywords:

  • multicentre study [publication type];
  • clinical competence/*standards;
  • education, medical, undergraduate/*methods;
  • remedial teaching/*methods;
  • schools, medical;
  • United States

Abstract


Objective  Most US medical schools conduct comprehensive clinical skills assessments during Years 3 and 4. This study explores strategies used to identify and remediate students who perform poorly on these assessments.

Methods  In the academic year 2005–06, we conducted 33 semi-structured interviews with individuals responsible for standard setting in and remediation after their schools’ comprehensive clinical skills assessments. We coded interviews to identify major themes.

Results  Prior to remediation, some schools employed a ‘verification’ step to ensure the accuracy of the failing score or need for remediation. Participants described a remediation process that included some or all of 3 steps. Firstly, students’ specific learning deficits were diagnosed. Next, students participated in remedial activities such as performance review sessions or practice with standardised or actual patients. Lastly, students were re-tested, usually with a shorter, more formative examination. All participants reported using a diagnostic step, most offered or required remedial activities and many re-tested, although schools varied in the emphasis placed on each step. Many participants cited the individualised attention students received from remediation faculty staff as a strength of their approach, although they raised concerns about the substantial time demands placed on remediation faculty. Most respondents reported some dissatisfaction with their school’s remediation process, particularly uncertainty about efficacy or rigour.

Conclusions  Schools vary in the intensity and scope of remediation offered to students who perform poorly on clinical skills assessments. Although many schools invest significant resources in remediation, the effect of these efforts on students’ subsequent clinical performance is unknown.


Introduction


To prepare students for residency training and clinical practice, medical schools need to identify and remediate students who have not achieved clinical skills competency. Unfortunately, multiple studies have shown that faculty staff rarely observe students’ clinical skills during clerkships or provide feedback based on observed patient encounters.1–3 Moreover, supervising clinicians are often reluctant to fail a trainee or even document poor performance in written evaluations, despite feeling able to identify skill deficits.4 This reluctance is driven by lack of remediation options, insufficient documentation of deficiencies to support a failing grade, and fear of trainee appeal.5 Hence, students may complete their clinical rotations without mastering defined competencies and poor performers may never receive adequate feedback on skills that need improvement.

Most medical schools in the USA now administer an interdisciplinary comprehensive assessment using standardised patients (SPs) presenting common, usually ambulatory, medical problems at the end of core clerkships.6 This examination provides an opportunity to identify and remediate students with skill deficits before they graduate from medical school. Passing an in-house comprehensive assessment is increasingly used as a criterion for graduation or promotion, and performance on such examinations has been correlated with subsequent performance during internship and after training.7,8 The addition of an SP component to the US Medical Licensing Examination further motivates medical educators to ensure that trainees master core clinical skills.9

This study explores the strategies educators use to remediate students after a comprehensive clinical skills examination. We sought to understand the consequences of failing the examination and how educators determine whether students’ skills have improved to an acceptable level after remediation. We describe participant perceptions of the strengths and weaknesses of their remediation efforts, and propose a framework for remediation based on the strategies study participants described.

Methods


Curriculum deans at 62 US medical schools with comprehensive clinical skills assessments provided us with the names of individuals responsible for standard setting on and remediation after their schools’ examinations.6 A comprehensive assessment was defined as a multi-station, cross-disciplinary examination involving SPs and occurring outside any single clerkship. We randomly selected individuals from the list of 62 remediation faculty and issued invitations to be interviewed. Potential subjects received up to 2 invitations.

Between August 2005 and July 2006, 1 investigator (KMK) conducted semi-structured telephone interviews lasting 30–60 minutes with enrolled subjects. The interview instrument developed by the investigators included open-ended questions about strategies used to identify students needing remediation, selection of remediation faculty, remediation activities, retest requirements, academic consequences of failing the examination, and strengths and weaknesses of the remediation process (see Appendix).

Interviews were recorded and transcribed verbatim. Three investigators (KEH, KMK, AT) coded transcripts using open and axial coding methods. Coding was performed over the year, as transcripts became available. We used atlas.ti Version 5.0 software (Scientific Software Development GmbH, Berlin, Germany) to organise and retrieve coded data. Information describing the structure of each school’s remediation programme (range of remedial activities, presence of re-test, etc.) was summarised in tables. All 5 investigators reviewed the coded data to identify themes and analysed the summary tables to identify common approaches to remediation. The large and representative sample, use of multiple researchers and careful review of transcripts all contribute to the study’s validity.

We use frequencies when reporting the structure of participating schools’ remediation programmes (e.g. number of respondents identifying preceptorships as a remediation activity). Interviewee perceptions about the efficacy of their schools’ programmes and the factors that influenced programme design and functioning are presented as themes.

Results


We contacted 53 of the 62 potential subjects and interviewed 33. Of the 20 contacted individuals who were not interviewed, 2 did not respond, 1 declined, and 1 was excused when we determined that the represented school’s comprehensive examination did not meet study criteria. The remaining 16 indicated interest in participating but were not interviewed because thematic saturation had been reached and no new themes were emerging from the data. The 33 participants, their schools and their comprehensive examinations are described in Table 1.

Table 1.   Participant characteristics

                                                      No.    %
Respondents
 MDs                                                   24   73
 PhDs                                                   5   15
 Other                                                  4   12
 Examination director                                  22   67
 Dean (education, evaluation, curriculum)               7   21
 Other educational role                                 4   12
Schools
 Private                                               11   33
 Public                                                22   67
 US geographic region (AAMC)
  Central                                               7   21
  Northeast                                            10   30
  Southern                                             10   30
  Western                                               6   18
Examination characteristics
 Years in place
  1–3                                                   6   18
  4–6                                                   9   27
  7–10                                                 10   30
  11–14                                                 5   15
  >14                                                   3    9
 Scoring
  Checklist completed by SP                            26   79
  Faculty checklist/global assessment                   7   21
 Grading
  Normative (comparison to peers)                      20   61
  Criterion referenced (comparison with
   pre-defined criteria)                                7   21
  Combined normative–criterion referenced               6   18

AAMC = Association of American Medical Colleges; SP = standardised patient

Score verification

Seven participants volunteered that their schools took steps to verify that low-scoring students actually performed poorly and needed remediation. Verification consisted of reviewing score reports and a video-recording of the student’s examination. The examination director typically performed this function, either alone or with a clinical skills examination oversight committee. Respondents described instances where scoring errors or flawed checklist design artificially lowered student scores:

‘We have at least 2 clinicians viewing the videotapes [of student–patient interactions] of those bottom 15% of scores, viewing the checklists, and trying to determine is there an issue here or was this an anomaly...’

Remediation process

Participants described a 3-step remediation process consisting of diagnosis, remedial activities and re-testing (Fig. 1). Not all 3 steps were included in every school’s remediation programme, and participants used a variety of approaches to each step.


Figure 1.  Model of the remediation process. SP = standardised patient


Diagnosis

All participants described a diagnosis step, when score reports and often videos were reviewed to identify students’ specific skill deficits. The persons responsible for diagnosis were commonly the examination director (12 schools), a member of an examination or academic oversight committee (9), a dean (5) or other clinical faculty (7). One examination director described the diagnostic process:

‘With students who do poorly I watch most of the encounters. We have access to what the SPs put on their checklists [the SP’s rating of the student’s performance]. I ask the students to note at the close of each encounter what they think their strengths were, and what they wish they had done differently, so we’ve got a contemporary reflection by the student about what went well and what went poorly.’

At 2 sites, the diagnostic process incorporated assessment data from other sources, such as clerkship evaluations, to allow for more comprehensive characterisation of deficits. Fourteen schools involved the student in identifying deficiencies, and at 7 schools the student did the initial diagnostic work by independently reviewing the scores and video and identifying areas of weakness. Two schools allowed students to review a recording of a gold-standard student performance on the same case. Usually, self-assessments were followed by sessions in which the same information was reviewed with a faculty member.

Remedial activities

Remediation activities varied widely, from reflection and goal setting in a student–faculty meeting to practice with standardised or actual patients. Most schools developed individualised remediation plans for each student, although 6 schools employed a uniform approach for all students or all students with certain types of deficits. The most common remediation strategies were individualised performance reviews, practice with SPs and preceptorships. These and other activities are outlined in Table 2.

Table 2.   Remedial activities performed after a comprehensive clinical skills assessment

Individualised performance review
 Examination director reviews videotape with student and offers strategies for improvement

Practice with standardised patients
 Student conducts SP encounters focused on clinical problems that will challenge the student to improve on deficits; SPs and faculty observers provide feedback during or after the encounters

Clinical observation and feedback
 Preceptorships: a preceptor, selected for teaching reputation and/or educational role in curriculum, hosts student in clinic; number of sessions may vary based on preceptor impressions of improvement
 Observed encounters in clerkships: examination director asks clerkship faculty to observe student performing certain skills and to provide student with instruction and feedback; clerkship director reports on student progress
 Assigned clerkship: student is assigned to a clerkship selected to maximise opportunities to practise deficient skills
 Clinical skills clerkship: student participates in a clinical skills course specifically designed for senior students who struggled with the SP examination

Other activities
 Independent work: directed reading, web-based modules, or practice in interpreting diagnostic test results
 Counselling: referrals to a counsellor or psychiatrist to address communication problems (e.g. poor eye contact, odd affect) or professionalism issues
 Clinical knowledge tutoring: directed reading, clinical content testing, or faculty tutoring to address problems linked to clinical content knowledge deficits
 SP trainer sessions: sessions with an SP trainer to address minor physical examination technique deficits
 Group activities: workshops or discussion sessions focusing on clinical reasoning

SP = standardised patient

Individualised performance review

A total of 29 participants conducted an optional or required performance review session, in which students met with the examination director or a designated faculty member to discuss examination results. Such sessions often combined diagnostic activity, where student and instructor explored reasons for the poor examination score, with some instructional activity aimed at improving student performance. Instructional activities included physical examination drills, practice with communication techniques, and tips on test-taking strategies. The performance review was usually the first step of the remediation process, although at 2 schools the session was the only required remedial activity. One participant explained:

‘The students have an appointment with me… and we go over their deficits and offer them further opportunities to work on them if they want. But in most instances, just being aware of what was going on helps the individual considerably.’

Participants using this approach placed the responsibility for addressing deficits on the students, who might be advised to engage in self-directed remedial activities throughout Year 4. At 7 schools, faculty supplied students with written ‘learning prescriptions’ that included analysis of deficits and recommendations for change.

Practice with SPs

Fourteen schools employed optional or required practice with SPs, either in individual sessions, a formative clinical skills examination, small-group workshops, or even a month-long clinical skills clerkship. These experiences were structured as teaching sessions, with faculty preceptors or experienced SPs giving feedback. One participant described such an experience:

‘The student would see a SP case and then come out, then the faculty would go in with the student, and instead of writing a post-test, they were working on what’s going on in the room. It was really remarkable because the students changed their behaviour. They would build off what they were learning each time.’

Clinical observation and feedback

A total of 24 schools recommended or required participation in preceptorships or similar clinical observation and feedback experiences (Table 2). Remediation preceptors were typically selected because of their role in the curriculum, teaching expertise, or availability. The examination director, examination committee or a dean usually recruited the preceptors, although at one school the student was responsible for this task.

The amount of clinical time spent with the preceptor varied from a few sessions to enrolment in a month-long rotation. Sometimes a particular number of sessions was prescribed upfront, although it was equally common for the sessions to continue until the preceptor felt the student’s deficits had been resolved. Several participants acknowledged that their process was highly variable:

‘We’ve been very loose. The faculty preceptor could say, “Okay, from this observation do you understand what you needed to do better?” “Yeah.” That ends that.’

Ten schools had policies allowing the examination director to modify failing students’ Year 4 schedules to accommodate assigned remediation preceptorships. One participant reflected:

‘We have remediated them through choices of fourth-year clerkships where we think that they’ll get more exposure to certain problems.’

Nine other schools folded remediation into scheduled clerkship or elective experiences:

‘They don’t have to give up elective time… most of the electives they have, they are able with normal patient contact to do a lot of these remediation steps. They incorporate it into what they’re already doing day in and day out.’

At schools using this approach, clerkship directors were notified about the student’s specific needs and were asked to provide opportunities for observation and focused feedback. At 7 schools, clerkship directors and preceptors were required to document observed encounters or report on student progress.

Re-testing

A total of 24 schools re-tested students. The re-test requirement was based on the original examination score or on a remediation preceptor’s discretion. The re-test was typically shorter than the original examination and commonly included 1–4 stations. At 3 schools the re-test consisted of an SP examination designed for another purpose, such as a formative examination for more junior students.

Several participants acknowledged that their re-test was easier than the original examination, and none stated that the re-test was more complex. Six schools employed faculty observers at the re-test to perform global assessments of competency and provide feedback to students during or immediately after the examination. At 4 schools the re-test was designed to assess the individual remediation student’s specific deficits. Eight participants whose schools did not offer a re-test reported that resources (time, money, space) precluded re-testing.

Consequences of failing the comprehensive assessment

At 20 schools, students who failed the comprehensive assessment were required to pass a re-test to graduate. At 6 schools, participation in remediation alone was required for graduation. Five schools documented failing performances in the transcript or the dean’s medical school performance evaluation for residency. Four schools had no consequences for failing. Two of these offered a menu of optional remedial activities and the other 2, both small schools with strong advisory dean systems, perceived the examination as a formative experience. At these sites all students, not just those who failed, reviewed scores and created individualised learning plans based on examination results.

Six participants described increasing the stakes of the examination over time: they moved from an optional to a required examination, and subsequently introduced requirements that students who failed the examination participate in remediation or pass a re-test as a condition of graduation. Some adopted a practice of documenting examination failure on the student’s transcript. Participants linked this escalation in the academic consequences of failing the examination to the school’s confidence in and experience with administering the assessment. As one participant described:

‘It has taken us a while to start having real confidence in the quality of our measures. If you don’t have confidence that your measures are reliable and valid, it makes it really hard to make a case for saying this student really needs to do a lot of extra work, or this student needs to be not cleared for graduation.’

Strengths and weaknesses of the remediation process

Participants frequently identified the amount and quality of faculty time dedicated to remediation as a major strength of the process. Interviewees perceived that students greatly appreciated the individual attention and detailed feedback they received during remediation. One participant explained that students reported ‘…how amazing this was for them’ and that ‘no-one before in their life had ever given them that kind of feedback so directly and so consistently’. As a rule, remediation faculty were described as dedicated, skilled teachers who utilised a positive, supportive approach.

Unfortunately, some schools had difficulty securing the faculty time needed to offer individual attention to remediation students. The challenge of finding remediation faculty and the intensity of the time commitment required of these individuals, who were often volunteers, were commonly cited weaknesses of the remediation process. One participant acknowledged that faculty availability influenced the scope of the remediation process:

‘There is a tension between wanting to be rigorous and wanting to go for the ideal, and then recognising that the clinical faculty have very little time. Because we’re asking them to do a favour, we’re trying to not make it really burdensome.’

Many participants confessed uncertainty about the rigour and efficacy of their remediation process. For some schools, the problem was administrative, as participants questioned whether the assigned remediation actually happened. For others, the challenge of standardising preceptorships and other clinical experiences was perceived as a weakness:

‘Much of the difficulty in clinical medicine is the students do not see the skills reinforced very well when they’re in a less controlled setting… sometimes what they see is good role modelling, and sometimes what they see is not.’

Participants raised concerns about lack of rigorous outcome data regarding both the re-test, which was often too short to yield reliable data, and post-remediation clinical performance. Participants lamented that they could not know what students would do in actual clinical settings after remediation had been completed:

‘Is the goal to help them pass the clinical skills exam or is the goal to turn out somebody that you think is going to consistently communicate well with patients?’

Schools that left the decision about whether and how to remediate deficits to the students viewed this strategy as a weakness:

‘The best that we do currently is to tell them to be sure when they do rotations to work on those specific skills and to be observed. We leave the ball in the student’s court, so it’s just not good enough. We need to have more of a programme.’

Discussion


Our participants described a remediation process that included diagnosis of the learner deficits, remedial activities and re-testing (Fig. 1). Schools varied in the degree to which they emphasised each step, and some schools varied their approach to accommodate individual student deficits or preferences. This variety and flexibility typifies remedial education in other undergraduate and professional settings.10–12

Many participants cited individual faculty attention as a major strength of their remediation programmes, an observation that is consistent with literature showing that remedial education has the greatest impact on the learner when it is individualised, highly interactive and delivered in a meaningful context.13 Students’ motivation can be maximised by providing choice about some aspects of the remediation plan, as some of our participating schools did, and by creating realistic activities, as our participants did by incorporating standardised or actual patient experiences into their remediation programmes.13

Several participants admitted uncertainty about the efficacy of their schools’ remediation processes. Although a majority of participating schools re-tested failing students, most re-tests were easier than the original examinations. The absence of robust re-test examinations may reflect decisions to allocate limited resources to instructional activities that would benefit more students, or to rely on licensing examinations to determine competency. Some participants, even those with required re-tests, remained uncertain of how remediation students would perform in actual student−patient interactions.

Some participants described a pre-remediation verification step that uncovered errors in case design or scoring. Need for verification could be obviated with improvements in checklist design and adoption of techniques that increase scoring reliability.14 However, the often high-stakes nature of the clinical skills assessment justifies devoting effort to assuring scoring accuracy. Our findings suggest that, particularly with recently developed examinations, the verification process improved examination design and enhanced student and faculty confidence in examination results. Further research exploring curriculum deans’ perspectives about the validity of examination scores compared with other performance measures should be undertaken.

This study has limitations. Interviewees’ perceptions might differ from those of other educators at their institutions, and we did not obtain student perspectives. Only 27% of US medical schools were represented in our sample, so our findings may not generalise to all medical schools. However, SP assessments are used widely and internationally, and the challenges of ensuring competence15,16 and enhancing performance17,18 are not unique to US medical schools. Our proposed framework for remediation would be applicable across clinical skills testing settings within and outside the USA. Further, our sample size was large for a qualitative study, and we reached saturation within our sample.

Given the wide variation in approaches to remediation, and our participants’ uncertainty regarding the efficacy of their techniques, additional research is needed to determine the effectiveness of different remediation strategies in improving clinical performance longitudinally, and to obtain students’ perspectives on different remediation activities. Future research could also determine which strategies are most effective for specific skill deficits, and how different approaches can be adapted to address individual student needs. Insights gained from such inquiries would go a long way toward ensuring that all trainees develop into successful clinicians.

Contributors:  all authors made substantial contributions to the study conception and design, and/or the acquisition, analysis or interpretation of data, and the drafting or critical revision of the article. All authors approved the final version of the manuscript.

Acknowledgments


Acknowledgements:  the authors acknowledge the interviewees for their contributions.

Funding:  this study was funded by the Josiah Macy Junior Foundation.

Conflicts of interest:  none.

Ethical approval:  this study was approved by the University of California, San Francisco Institutional Review Board.

Overview


What is already known on this subject

In the final year of medical school, standardised patient comprehensive assessments identify a small number of students who have not achieved clinical skills competency.

What this study adds

Medical school faculty charged with directing clinical skills remediation after their schools’ comprehensive assessments described a 3-step remediation process that incorporates diagnosis of learner deficits, remedial activities such as clinical preceptorships or practice with standardised patients, and re-testing. Remediation consumes substantial time and resources. Outcomes of remediation are unclear, reflecting a lack of both rigorous re-testing and long-term follow-up.

Suggestions for further research

Research into the effectiveness of different remediation strategies might guide faculty in optimising remediation programmes for medical students.

References

 1 Kassebaum DG, Eaglen RH. Shortcomings in the evaluation of students’ clinical skills and behaviours in medical school. Acad Med 1999;74:842–9.
 2 Howley LD, Wilson WG. Direct observation of students during clerkship rotations: a multi-year descriptive study. Acad Med 2004;79:276–80.
 3 York NL, Niehaus AH, Markwell SJ, Folse JR. Evaluation of students’ physical examination skills during their surgery clerkship. Am J Surg 1999;177 (3):240–3.
 4 Speer AJ, Solomon DJ, Fincher RM. Grade inflation in internal medicine clerkships: results of a national survey. Teach Learn Med 2000;12 (3):112–6.
 5 Dudek NL, Marks MB, Regehr G. Failure to fail: the perspectives of clinical supervisors. Acad Med 2005;80 (Suppl 10):84–7.
 6 Hauer KE, Hodgson CS, Kerr KM, Teherani A, Irby DM. A national study of medical student clinical skills assessment. Acad Med 2005;80 (Suppl 10):25–9.
 7 Tamblyn R, Abrahamowicz M, Dauphinee WD, Hanley JA, Norcini J, Girard N, Grand'Maison P, Brailovsky C. Association between licensure examination scores and practice in primary care. JAMA 2002;288 (23):3019–26.
 8 Taylor ML, Blue AV, Mainous AG III, Geesey ME, Basco WT Jr. The relationship between the National Board of Medical Examiners’ prototype of the Step 2 clinical skills exam and interns’ performance. Acad Med 2005;80 (5):496–501.
 9 Federation of State Medical Boards (FSMB) and National Board of Medical Examiners (NBME). United States Medical Licensing Examination Step 2 Clinical Skills (CS) Information. Dallas, TX; Philadelphia, PA. http://www.usmle.org/step2/Step2CS/Step2Indexes/Step2CSIndex.htm. [Accessed 1 February 2007.]
10 Perin D. Promising approaches for remediation. Community Coll J 2001;72 (1):53–6.
11 Forrest L, Elman N, Gizara S, Vacha-Haase T. Trainee impairment: a review of identification, remediation, dismissal, and legal issues. Couns Psychol 1999;27 (5):627–86.
12 Boylan HR. Exploring alternatives to remediation. J Dev Educ 1999;22 (3):2–8.
13 Johnson GM. Constructivist remediation: correction in context. Int J Spec Educ 2004;19 (1):72–88.
14 Huber P, Baroffio A, Chamot E, Herrmann F, Nendaz MR, Vu NV. Effects of item and rater characteristics on checklist recording: what should we look for? Med Educ 2005;39 (8):852–8.
15 Stern DT, Ben-David MF, De Champlain A, Hodges B, Wojtczak A, Schwarz MR. Ensuring global standards for medical graduates: a pilot study of international standard-setting. Med Teach 2005;27 (3):207–13.
16 Roberts C, Newble D, Jolly B, Reed M, Hampton K. Assuring the quality of high-stakes undergraduate assessments of clinical competence. Med Teach 2006;28 (6):535–43.
17 Hodges B, Regehr G, Martin D. Difficulties in recognising one’s own incompetence: novice physicians who are unskilled and unaware of it. Acad Med 2001;76 (Suppl 10):87–9.
18 Junger J, Schafer S, Roth C, Schellberg D, Friedman Ben-David M, Nikendei C. Effects of basic clinical skills training on objective structured clinical examination performance. Med Educ 2005;39 (10):1015–20.

Appendix


Appendix: Semi-structured interview questions

Please describe your school’s Clinical Skills Assessment Examination.

  • Number of stations
  • Length of each station
  • Timing in the curriculum

What is your role in your school’s Clinical Skills Assessment Examination?

  • How long have you been in this role?
  • What percentage of your time do you spend on the programme?

How do you score the examination?

  • Method of scoring
  • Method of determining pass/fail: normative/criterion referenced
  • Implications of failing examination: what do students need to do to meet the requirement?

How is the school’s remediation process structured?

  • Which students receive remediation?
  • Which faculty members administer the remediation? How are they selected?
  • When does the remediation occur?
  • In what format?
  • Is retesting required?

How successful is the remediation process?

  • Does performance improve after remediation?
  • Do you have any student feedback?
    • Do students find it helpful? Why or why not?
  • How confident are you in the process your school employs?
    • What are the strengths and weaknesses of your current process?

What are your school’s future plans for remediation?

  • How could the remediation process be improved?