Strategies for remote assessment of medical students at University of Minho

The outbreak of the COVID-19 pandemic was extraordinarily disruptive and presented medical educators with unprecedented challenges. The School of Medicine at the University of Minho rapidly moved most curricular activities to online formats, including our assessment processes. Our experience with remote assessment was very limited when we made the decision, and we knew that there was not enough time for extensive pilot testing. We were confident that we could leverage our overall expertise in assessment design and delivery, and we were able to gather valuable information on access to, and confidence with, web-based instruments through student surveys.

Validity in performance-based assessment (PBA) requires defining and representing a construct well. When such assessments are shifted from physical to remote virtual environments (V-PBA), construct under-representation is inevitable unless (a) the elements of the construct are aligned with the modality, and (b) the V-PBA is supplemented with processes that support construct representation and inferences regarding competence. First, in a high-stakes paramedic V-PBA, we aligned the dimensions of competence with the modality, assessing four of seven dimensions (history gathering, patient assessment, decision making and communication) and eliminating three (situation awareness, resource utilisation and procedural skills). Implementation required creating authentic virtual interactions for similar content (i.e., content validity) and re-orienting candidates and raters to the modality (i.e., reducing sources of construct-irrelevant variance). Technological solutions to support sampling and security were also needed. Second, the V-PBA was structured to take advantage of in-training and workplace-based assessments conducted beforehand, and to be supplemented afterwards with extensive in-practice onboarding, mentoring and monitoring of performance in work-based settings to complete the construct. This leveraged both the in-training and the post-V-PBA environments. Our focus at this point, however, was on using the V-PBA and on the impact of splitting the construct of interest.


| WHAT LESSONS WERE LEARNED?
Transitioning performance-based assessments to remote virtual environments may give assessment designers an additional stimulus and response format, with some caveats. In our preliminary review of the model, faculty reported that reducing or narrowing the construct allowed a sharper focus on, and better assessment of, the included dimensions; that candidates adapted well; and that they were able to observe variations in performance. They also believed that predictions regarding future performance were possible, and they rated their ability to assess the included dimensions highly (from 3.8 to 4.9 out of 5). Candidates perceived the stimulus format as authentic, felt they could demonstrate the competencies related to the narrowed construct (ratings from 3.8 to 4.2 out of 5), and judged it better than other forms of simulation. Both groups raised concerns about the use of raters and standardised actors/patients, and about the need for additional assessments to capture other dimensions of performance (e.g., procedural skills). Splitting the construct to accommodate and align with a virtual environment may therefore be an appropriate alternative, provided it is coupled with confirmation of performance ability in the other dimensions beforehand (e.g., during training) and/or with practice-based onboarding and transitions to independent practice that include both formative and further summative assessment and/or monitoring.
In this way, construct representation can be assured while taking advantage of training and/or work contexts.

| WHAT PROBLEMS WERE ADDRESSED?
Academic integrity is a recognised challenge for the higher education community, even during normal (non-pandemic) times. Universities invest significant resources in preventing and detecting inappropriate behaviours in order to uphold rigorous academic standards and maintain stakeholder confidence in the qualifications they provide. In health professions education, academic integrity is arguably even more crucial, as breaches have the potential to impact patient safety. Our particular challenge was to promote and maintain academic integrity while making a rapid transition to remotely invigilated examinations during the COVID-19 pandemic.

| WHAT WAS TRIED?
Recognising the various critical dimensions of valid remote assessment, the School of Medicine at the University of Minho adopted the following parameters for remote assessment:
Formative purpose. Online examinations will be formative assessments. These examinations are designed to help students measure individual progress and to allow faculty to monitor the effectiveness of a remotely delivered curriculum and assessment process. There will be adequate student training for the assessment process and prompt feedback on performance.
Item characteristics. Our examinations emphasise integration and problem solving rather than recall. Items are timed to minimise reliance on external sources. Test items are presented in random order to further protect security, and we trust students to act with integrity.
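Per-student item randomisation of the kind described above can be made reproducible for audit purposes by seeding a random generator with an exam identifier plus the student identifier. A minimal sketch follows; the function name, identifiers and seed scheme are illustrative assumptions, not our production system:

```python
import random

def shuffled_items(item_ids, student_id, exam_seed):
    """Return a per-student ordering of exam items.

    Seeding with the exam seed plus the student identifier makes the
    order reproducible for later audit while differing between
    students, which limits answer sharing during a remote sitting.
    """
    rng = random.Random(f"{exam_seed}:{student_id}")  # deterministic per student
    order = list(item_ids)
    rng.shuffle(order)  # in-place shuffle of the copied list
    return order

# Hypothetical usage: each student receives a reproducible ordering
a = shuffled_items(range(1, 6), student_id="s001", exam_seed="exam-2020-04")
b = shuffled_items(range(1, 6), student_id="s002", exam_seed="exam-2020-04")
```

Because the seed is derived from stable identifiers rather than the clock, the same ordering can be regenerated if a result is later queried.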
Fairness. Fairness is a key aspect of assessment. As we cannot assure uniformity of testing conditions for remote examinations, and we do not yet know the impact of variable testing circumstances on examination performance, we decided to normalise scores to the average score of the past 3 cohorts in the same examination. This is based on the premise that cohorts of students are quite similar in average performance. Each student's performance will be calculated as a z-score within the remote examination cohort and then normalised to the reference scores of the past 3 years. This assures that students will not be disadvantaged when compared with previous and future cohorts.
Freedom of choice. Because some students may feel uncomfortable with this format or disagree with the normalising procedure, students will be given the freedom to request, a priori, assessment through an oral examination without penalty or prejudice.
Transparency and emotional support. Students contributed to the process that resulted in the current assessment model and will be continually consulted throughout delivery. This assures transparency and also helps decrease emotional distress in a very complex context.

| WHAT LESSONS WERE LEARNED?
We consider that this strategy represents good practice in remote assessment and may be transferable to other contexts and learning environments. So far, the evidence gathered indicates that it has helped students gain trust in the assessment process and assures that assessment achieves its main purpose as a critical tool to improve learning.