Assessment in a problem-based learning course: Twelve tips for constructing multiple choice questions that test students' cognitive skills

Authors

  • Samy A. Azer

    Corresponding author
    School of Medicine, Dentistry and Health Sciences, FEU, University of Melbourne, Victoria 3010, Australia. Tel.: +61-3-83448035; Fax: +61-3-83440188

Abstract

For over 35 years, multiple choice questions (MCQs) have been used in the assessment of students in medical, dental, nursing, and physiotherapy courses. MCQs enable educators to test a broad range of topics in the curriculum. However, most MCQs currently in use test factual knowledge rather than a deeper understanding or use of information. They do not test cognitive skills, and many of them test small print in textbooks. With the introduction of problem-based learning in medical and health professional courses and a full shift from a traditional lecture-based curriculum to a student-centered program, many schools are currently reviewing their assessment tools and introducing new strategies that reflect the philosophy of the new curriculum. However, there is limited information in the literature to guide question writers in constructing new styles of MCQs that test higher-order thinking skills. This article provides 12 practical tips to help question writers create MCQs that test cognitive skills.

Problem-based learning (PBL)1 has been widely used in medical, dentistry, physiotherapy, occupational therapy, speech pathology, and allied health curricula for 20–30 years. The educational principles on which PBL courses are designed are: to reduce the factual load of the course; to enhance students' self-directed learning and communication skills; to encourage integration of basic and clinical sciences; to address community-related issues; and to promote small-group learning [1]. With the change in the curriculum, assessment needs to match the philosophy of the curriculum and reflect its educational outcomes. Assessment must therefore address more complex outcomes, not just the examinees' ability to recall information. Multiple-choice questions (MCQs) have been widely used in assessment in medical schools and other health professionals' institutes in the United States, the United Kingdom, Europe, and Australia for over 35 years. However, MCQ examinations have been viewed as of limited value for PBL courses for the following reasons: a) the use of MCQs in assessment will interfere with the students' learning process and force them to focus on details in lectures and textbooks rather than the desired skills embedded in PBL; b) traditional MCQ examinations do not match the philosophy of the course; and c) traditional MCQs have limited validity for measuring the application of knowledge, and most questions ignore the whole picture [2]. Although there have been several suggestions to improve the quality of MCQs [3, 4], none of these suggestions has enhanced the power of MCQs to test cognitive skills or assess competence.

Well-constructed MCQs may demand a great deal of analytical thinking [5], enabling examiners to test integration of knowledge, problem-solving skills, and application of knowledge. Recently, a number of schools have implemented extended matching questions [6] and converted question stems into clinical vignettes. The use of a scenario instead of a brief statement in the stem may help in constructing questions that focus on specific educational objectives. For example, a scenario allows examiners to a) test students' abilities to assess the significance of key words provided in the scenario; b) test students' abilities to generate hypotheses, integrate information, and look for supportive evidence for each hypothesis; c) test higher-level thinking and the use of information rather than rote learning; and d) write questions free from grammatical clues. This article provides 12 practical tips to help question writers create MCQs that test cognitive skills.

TIP 1. ENSURE THAT QUESTION WRITERS UNDERSTAND THE PHILOSOPHY OF THE CURRICULUM

Authors of questions should be aware of changes introduced in the curriculum and the key principles of the new course. For example, the approach used in a PBL course is student-centered and aims at the following educational objectives:

  • Enhancement of students' skills to acquire principles and key concepts that are better retained by the learners and allow them to apply information learned in other similar situations;

  • Development of students' reasoning skills, critical thinking, and decision-making strategies;

  • Preparation of students to pursue life-long learning;

  • Promotion of small group learning and the need for effective teamwork and collaborative learning.

If you are planning to lead a small team to write MCQs for a summative or formative assessment, consider the following:

  • Your team should include representatives from each discipline taught in the block/semester plus two medically qualified members to help in writing the case scenarios;

  • Ensure that your team is aware of the structure and design of the curriculum and the philosophy of assessment;

  • Provide them with guidelines regarding the different components of assessment, the tools to be used, the number of MCQs required, and examples of the style of questions to be used;

  • Discuss areas to be covered in MCQs and integration and allocate roles;

  • Discuss the 12 tips with your team and provide them with a copy.

TIP 2. WRITE QUESTIONS THAT ADDRESS SPECIFIC EDUCATIONAL OBJECTIVES

On the basis of the changes in the curriculum and the introduction of PBL, MCQs should focus on testing the following five domains of competence:

  • Analytical skills: the ability to interpret the significance of key words, clinical data, or laboratory findings provided in the stem of the question [7];

  • Problem-solving skills: the ability to use knowledge acquired to solve problems;

  • Cognitive skills: the ability to generate hypotheses, provide justification, and make priorities on the basis of the information provided in the stem;

  • Integration of knowledge: the ability to understand the basic scientific principles and concepts related to the question;

  • Thinking holistically: the ability to evaluate the whole picture as well as the parts.

To achieve these educational objectives, it is recommended that the question stem uses a case scenario and provides 5–6 key words that summarize important findings from the medical history, clinical signs, or laboratory investigations (see Appendix I for examples of scenario-based MCQs).

What Constitutes an Ideal Scenario for an MCQ?

  • It represents a real case commonly seen in daily practice;

  • It is carefully written, with no ambiguity;

  • It leaves the examinee with no doubt about the key information provided;

  • It provides sufficient information about the patient, using key words;

  • It provides important negative findings in the history and notes related clinical signs that show no abnormalities;

  • It describes changes that happened to the patient rather than the interpretation of findings. This allows the examinee to form their own interpretation of findings;

  • It provides the normal reference values wherever the patient's laboratory results are shown in the case;

  • It is an integral part of the questions;

  • It provides the opportunity to construct a number of questions with specific educational objectives covering issues related to the case.

Table I illustrates why a scenario is recommended instead of a statement in the stem of an integrated question.

Guidelines for Creating Scenarios

Use the following guidelines as you create your question scenarios:

  • Use real life patients as the basis of your case scenario;

  • Structure your cases in a way that reflects the philosophy of the course, the level of the students, and the educational objectives of the assessment;

  • In addition to the case scenario, the stem may include a table showing laboratory results, an x-ray, or an image showing a clinical sign;

  • Each scenario may be followed by 4–5 questions related to the scenario, but each question tests a specific aspect. This might help you in integrating knowledge related to the scenario and allow students to think holistically;

  • Identify the educational objectives of the question;

  • Ensure that the question tests understanding of concepts and principles rather than trivialities.

TIP 3. ENSURE THAT THE QUESTIONS TEST UNDERSTANDING

How can you be sure that your question tests understanding rather than recall of information? To answer such a question:

  • Students need to understand the significance of key words provided in the stem scenario, or the findings provided in the table of laboratory results with the stem;

  • Students need to assess each option in the question (the five items after the stem) against the information provided in the stem and make their decision on the basis of that information;

  • Students need to use the hypothetico-deductive approach that they usually use in PBL tutorials;

  • Students should not need to consult a specific textbook, and the answer to the question is not just a statement mentioned by the lecturer. The question should be useful to students studying this area of knowledge in other institutes.

Two further indicators are that a) the explanation of the answer reflects a process of thinking rather than a single statement; and b) the distractors are genuinely part of the test items. Useful distractors keep the examinee engaged in the question. They may also contribute to the outcomes of scoring a question.

TIP 4. ENCOURAGE INTEGRATION AND APPLICATION OF INFORMATION

Why do you need to test integration and application of knowledge?

  • To provide assessment that suits the needs of an integrated, PBL curriculum and meets students' expectations;

  • To assess whether students are able to apply information from different disciplines to develop a meaningful understanding related to a case;

  • To assess whether students are able to establish links and to explore different aspects of a concept;

  • To achieve the learning objectives identified above.

Before your team starts writing MCQs, it is important to map: areas covered in questions, the educational objectives of each question, and the strategies you will use to integrate information (see Appendix II for an example of planning and mapping of questions).
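This mapping can be sketched as a simple data structure that the team fills in before writing begins. A minimal Python sketch; the topic, discipline, and objective labels below are hypothetical illustrations, not taken from Appendix II:

```python
# Hypothetical question blueprint: each entry records the area covered,
# the disciplines integrated, and the educational objective tested.
blueprint = [
    {
        "question": 1,
        "topic": "acid-base balance",
        "disciplines": ["biochemistry", "physiology"],
        "objective": "interpret laboratory data (analytical skills)",
    },
    {
        "question": 2,
        "topic": "ketogenesis",
        "disciplines": ["biochemistry"],
        "objective": "apply knowledge to explain a clinical finding",
    },
]

def coverage(blueprint):
    """Count how many questions touch each discipline, to check balance."""
    counts = {}
    for q in blueprint:
        for d in q["disciplines"]:
            counts[d] = counts.get(d, 0) + 1
    return counts

print(coverage(blueprint))  # {'biochemistry': 2, 'physiology': 1}
```

A summary like this makes it easy to spot disciplines that are over- or under-represented before the paper is assembled.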

TIP 5. AVOID EACH OF THE FOLLOWING IN YOUR QUESTIONS

Table II illustrates some pitfalls that will affect the validity of the questions.

TIP 6. WORK ON FACTORS AFFECTING THE VALIDITY OF QUESTIONS

Assessment tools must be valid. Validity is defined as the extent to which an instrument is actually able to measure what it is intended to measure. Validity of MCQs might include the following two components: a) the extent to which a test is able to measure the subject matter, pre-identified cognitive skills and educational outcomes; and b) the certainty with which a test can assess students' performance. The validity of a test may include the elements seen in Table III.

It is obvious that MCQs cannot cover a range of validation this wide, but the new style of questions should be tested for content, criterion-related, and face validity. To ensure that your assessment achieves these objectives [8, 9], it is important to consider the following:

  • Careful selection of content of the questions and weight of the topics to be assessed;

  • Match the cognitive skills required with your preset list of educational objectives;

  • Avoid imprecise terms or undefined words;

  • Provide clear instructions to the examinee;

  • Avoid inconsistency in the terminology used in the curriculum and questions;

  • Provide sufficient time to answer questions;

  • Ensure that the style of the questions and their educational objectives match with the curriculum content;

  • Avoid using items that are too difficult or too obvious;

  • Randomize the location of the correct answer;

  • Encourage feedback from students and faculty members on the questions used.
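Randomizing the location of the correct answer can be automated. A minimal Python sketch, assuming each question is stored as an option list plus the index of the correct answer (the sample options are taken from Appendix I):

```python
import random

def randomize_options(options, answer_index, rng=random):
    """Shuffle the options and return (shuffled_options, new_answer_index)."""
    order = list(range(len(options)))
    rng.shuffle(order)
    shuffled = [options[i] for i in order]
    return shuffled, order.index(answer_index)

options = ["Ketoacidosis", "Hyperventilation", "Dehydration",
           "Change in urine pH", "Renal impairment"]
shuffled, key = randomize_options(options, 0)
# The answer key is updated automatically to follow the correct option.
assert shuffled[key] == "Ketoacidosis"
```

Tracking the key alongside the shuffle avoids the common error of randomizing a paper by hand and mis-transcribing the answer key.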

TIP 7. ADJUST THE LEVEL OF THE QUESTION TO THE STUDENTS' NEEDS

The following points will help you to decide on the level of difficulty posed by the question and what is required from the students.

  • How long does it take an average student to answer a question?

  • What is the level of complexity of the question?

  • What is the level of detail needed to answer the question?

  • Does the question match with the students' expectations?

  • Were the issues in the question covered in the curriculum at this stage?

  • Is the question focused on important principles or minute details?

TIP 8. PREPARE MODEL ANSWERS AND EXPLANATIONS

Why do I need to prepare model answers and explanations to the question?

  • Preparing model answers will allow you to discover scientific errors, grammatical mistakes, and ambiguity in the question scenario or in the distractors provided after the stem;

  • It will allow you to assess the appropriateness of each distractor and the possibility of editing or adding new distractors;

  • It will provide you with the opportunity to compare the preset learning objectives with what has been designed in the question to achieve these objectives;

  • It will provide the question reviewer with detailed model answers that will allow them to give you feedback on the questions;

  • It will give you an overall idea about the level of difficulty of the question;

  • It will allow the team responsible for putting the examination paper together to see if there are repetitions in the principles addressed in the examination.

TIP 9. ASK A COLLEAGUE TO REVIEW THE QUESTION AND MODEL ANSWERS

When you ask a colleague with expertise in the content area of the examination to review a question for you, it will be useful to ask for comments on the following issues:

  • Authenticity of the case scenario;

  • Keywords provided in the scenario and the distractors;

  • The possibility that there is more than one answer to the question;

  • Ambiguities, redundancies or other difficulties in the distractors;

  • Were there any grammatical errors?

  • Did the question achieve the preset educational objectives?

  • Was the time allocated to the question suitable?

  • Was the question appropriate to the level of the students?

  • Any comments on the model answers?

TIP 10. GIVE EXAMPLES OF THE MCQs TO YOUR STUDENTS BEFORE USING THEM IN A SUMMATIVE ASSESSMENT

Provide your students with a few examples of the new style of MCQs before implementing them in the summative assessment. It might be useful to include a few examples in the formative assessment and discuss model answers for these examples with your students.

TIP 11. ASSESS STUDENTS' PERFORMANCE ON QUESTIONS

  • Measure the degree of difficulty of the question in terms of the percentage of correct answers. This is also called the facility index (FI): FI = number of correct answers ÷ total number of examinees;

  • Measure the discrimination index (DI) by comparing the proportion of correct responses for a question from high performers (upper quartile of the class) with that from low performers (lower quartile of the class): DI = (number of correct answers by the high-performing group − number of correct answers by the low-performing group) ÷ number of students in the high- or low-performing group;

  • Questions with a facility index >0.7 and a discrimination index of >0.3 are good questions;

  • If both the facility and discrimination indices are <0.3, you might need to review the question for scientific errors and poor wording. Such a question might be canceled, provided that there is a strong reason for the students' poor performance.
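The two indices above are simple to compute. A minimal Python sketch; the class size and score counts are hypothetical examples:

```python
def facility_index(correct, total):
    """FI = number of correct answers / total number of examinees."""
    return correct / total

def discrimination_index(correct_high, correct_low, group_size):
    """DI = (correct answers in upper quartile - correct answers in
    lower quartile) / number of students in one quartile group."""
    return (correct_high - correct_low) / group_size

# Hypothetical results for one question in a class of 100 (quartiles of 25):
fi = facility_index(correct=72, total=100)
di = discrimination_index(correct_high=22, correct_low=9, group_size=25)
print(fi, di)  # 0.72 0.52
```

Here 72% of the class answered correctly, and the question separates high from low performers well (DI > 0.3).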

TIP 12. PROVIDE FEEDBACK TO YOUR STUDENTS

If the questions are used in formative assessment, arrange a session after the examination to provide your students with feedback. You will need to invite the key question writers who helped you in the development of the test to discuss the questions they have written and answer students' questions. Feedback may cover the following:

  • Why an answer is right or wrong;

  • A brief explanation for each item in the question;

  • Supporting evidence for the answers and possible resources for further reading;

  • Use of diagrams or overheads that facilitate discussion and answers;

  • Discuss key principles behind the question;

  • Encourage students to refer to their textbooks and lecture notes and review areas covered by questions they were unable to answer correctly;

  • Take this opportunity to discuss with them approaches for answering a scenario-based MCQ.

CONCLUSION

Scenario-based MCQs encourage students to combine the learning of facts with the development of a number of skills, such as analytical skills, integration and application of knowledge, and problem solving. The questions are free from grammatical clues and are therefore more likely to assess knowledge rather than test-wiseness. The new style of MCQs is recommended for summative and formative assessments, particularly in a PBL curriculum. The use of the 12 tips provided in this article will help question writers to introduce the new style of questions.

APPENDIX I: AN EXAMPLE OF SCENARIO-BASED MCQs

Questions 1–5 relate to the following clinical scenario. These questions have a single best answer.2

Clinical Scenario

Maria Roberts is a 10-year-old primary school student on insulin for her diabetes mellitus, which was diagnosed 2 years ago. Over the last 2 days, she has complained of generalized fatigue, fever, thirst, and frequent urination. She has lost her appetite, has nausea, and has vomited twice. Because she is not eating, her mother has given her half of her usual insulin dose. Maria was admitted to the hospital for further assessment. On examination, she appeared pale and dehydrated and was hyperventilating. Examination of the cardiovascular and respiratory systems was normal. Arterial blood, urine, and blood samples were sent to the laboratory for analysis.

  1. Which one of the following sets of results in Table IV would you expect in her condition?

  2. Which one of the following is the likely primary cause of the changes in her blood pH?

     1. Ketoacidosis
     2. Hyperventilation
     3. Dehydration
     4. Change in urine pH
     5. Renal impairment

  3. On admission to the hospital, which one of the following biochemical processes is predominant in Maria's liver?

     1. Gluconeogenesis
     2. Lipolysis
     3. Protein breakdown
     4. Fatty acid synthesis
     5. Glycolysis

  4. Which one of the following changes is most likely to have an important role in the pathogenesis of her diabetes?

     1. Insulin resistance
     2. Destruction of the alpha cells of the islets of Langerhans
     3. Impaired glucose uptake in skeletal muscles
     4. Enhanced hepatic glucose output
     5. Infiltration of the islets of Langerhans with mononuclear cells

  5. Which one of the following abnormalities would you expect to find after further investigation?

     1. Inhibition of glucagon secretion
     2. Impairment of the low-density lipoprotein receptors in the liver
     3. Raised C-peptide
     4. Raised islet antigen 2 antibodies (anti-IA2)
     5. Hyperinsulinemia

MODEL ANSWERS

Q1, Answer 4—

Maria has type 1 diabetes mellitus and has developed diabetic ketoacidosis (DKA). Supportive evidence for this hypothesis:

  • She is 10 years old and has been treated with insulin for diabetes mellitus, diagnosed 2 years earlier.

  • Over the last 2 days, she has developed an acute illness with fever (possibly caused by an infection) and signs suggestive of raised blood sugar and impaired diabetes control (developed frequent urination, dehydration, generalized fatigue, and was given half her usual insulin dose).

  • Patients like Maria, with type 1 diabetes, are likely to develop DKA (severe, uncontrolled diabetes with hyperglycemia and metabolic acidosis due to high circulating ketone bodies), particularly if they have an acute infection and are given low or no insulin injections. DKA is due to insulin deficiency and is therefore rarely seen in type 2 diabetes mellitus.

In type 1 diabetes mellitus, insulin deficiency and the relative glucagon excess reduce cellular uptake of glucose and accelerate catabolic processes (accelerated protein degradation, particularly in skeletal muscles, to provide amino acids as substrates for gluconeogenesis in the liver, and increased mobilization and utilization of stored triglycerides in fat). The fatty acids liberated in this process undergo oxidation in the liver to form ketone bodies (ketogenesis), which are released into the blood.

Ketone bodies are organic acids that lower arterial blood pH and produce metabolic acidosis (low blood pH and low bicarbonate). Hyperventilation and an increased rate of respiration aim to compensate for her acidosis (removal of CO2 from the body, giving a low arterial CO2). Ketone bodies are excreted in her urine (acidic urine).

Q2, Answer 1—

Hyperventilation is not the primary cause of the change in her blood pH; it is the body's response to compensate for the metabolic acidosis (the aim is to wash out CO2). Lactate is a product of glycolysis in muscle and would not be expected to accumulate in this diabetic state. Ketone bodies are organic acids and are the primary cause of the changes in her blood pH, as explained under Question 1.

Q3, Answer 1—

In untreated type 1 diabetes, gluconeogenesis predominates in the liver, driven by insulin deficiency and relative glucagon excess. Generally, insulin stimulates lipogenesis and inhibits lipolysis. Lipolysis occurs in adipose tissue rather than the liver. Protein breakdown to amino acids occurs mainly in skeletal muscles. Glucose utilization pathways such as glycolysis and glycogenesis are depressed in the liver of patients with untreated diabetes.

Q4, Answer 5—

Type 1 diabetes mellitus results from autoimmune destruction of the islet β cells (infiltration of the islets of Langerhans with mononuclear cells). This is the most important factor in the pathogenesis of type 1 diabetes. On the other hand, the pathogenesis of type 2 diabetes may include:

  • Insulin resistance;

  • Impaired glucose uptake in skeletal muscles;

  • Enhanced hepatic glucose output.

Q5, Answer 4—

Hyperinsulinemia is expected in type 2 diabetes rather than in type 1. As mentioned above, insulin deficiency (indicated by a low C-peptide) and a relative increase in glucagon secretion occur in type 1 diabetes. Raised islet antigen 2 antibodies (anti-IA2) are expected in type 1 diabetes, supporting the autoimmune nature of this disorder.

APPENDIX II: PLANNING FOR WRITING 5–6 MCQs AFTER A CASE SCENARIO ON DIABETIC KETOACIDOSIS

Figure 1 relates the educational objectives to the areas of knowledge of each question.

Figure 1. Planning for writing 5–6 MCQs after a case scenario on diabetic ketoacidosis: relating the educational objectives to the areas of knowledge of each question.

Table I. Advantages of using a scenario versus a statement in the stem of an MCQ

A statement as a stem of an MCQ:

  • Reflects a single task to the examinee (a micro-skill);
  • Aims at testing "rote" learning;
  • Encourages recall of memorized facts;
  • Does not allow integration of disciplines;
  • Suits a traditional curriculum.

A scenario as a stem of an MCQ:

  • Carefully written questions stimulate a number of higher levels of thinking, e.g. hypothesis generation, making decisions on the basis of evidence, and critical thinking;
  • Allows a thinking process;
  • Matches real-life situations and the application of basic sciences in a clinical format;
  • Allows integration;
  • Matches students' expectations in a PBL curriculum.
Table II. Pitfalls to avoid in constructing questions

  • Double-negative statements in the distractors. Reason: they may confuse examinees and are of limited educational value. Example: "Gluconeogenesis is commonly stimulated in the absence of adequate dietary carbohydrate intake."

  • Abbreviations, eponyms, and acronyms. Reason: they may be misinterpreted by the examinee. Example: "TCA" may be read as tricarboxylic acid or taurocholic acid.

  • Clues to the correct answer and grammatical clues [4]. Reason: such questions assess competency in the English language and test-wiseness rather than understanding of concepts. Example: the use of words such as "never," "all," "only," "always," "may," "characteristically," or "rarely" in the items.

  • Grammatical inconsistencies. Reason: grammatical inconsistencies in the distractors may sometimes lead the examinees to the right answer or distract them.

  • Imprecise terms or undefined words [8]. Reason: such words mislead the examinee, and their significance is difficult to interpret. Example: the use of words such as "never," "abundant," "mild," or "moderate" in the stem or in the distractors; the word "never" may be interpreted by the examinee as zero, 1%, or less than 5%.

  • Long items (distractors) with pairs or triplets of reasons. Reason: these should not be used, particularly when one of the reasons is correct but the others are not; always use one short statement and avoid the word "and." Example: "Reduced glutathione is essential for the normal structure of the red cell membrane and for keeping hemoglobin in the ferrous state."
Table III. Elements of test validity

  • Construct validity: the extent to which a test measures a theoretical trait or construct;

  • Content validity: whether the test is carefully designed to allow good selection of the related knowledge area and appropriate weighting of the topics covered;

  • Criterion-related validity: the effectiveness of a test in predicting a person's skills in a specified area, e.g. analysis of data, hypothesis generation, evaluation of evidence, and decision-making;

  • Predictive validity: the effectiveness of a test in predicting future performance, e.g. the success of graduates in specific skill areas;

  • Face validity: the students' perception of the test and its relevance to their learning.
Table IV. Patient test results (normal ranges: plasma pH 7.28–7.44; plasma [HCO3−] 21–28 mmol/l)

  1. Plasma pH 7.54; [HCO3−] 22 mmol/l; urine pH alkaline

  2. Plasma pH 7.41; [HCO3−] 25 mmol/l; urine pH acid

  3. Plasma pH 7.39; [HCO3−] 15 mmol/l; urine pH alkaline

  4. Plasma pH 7.30; [HCO3−] 16 mmol/l; urine pH acid

  5. Plasma pH 7.26; [HCO3−] 36 mmol/l; urine pH acid

Acknowledgements

I would like to thank members of the assessment teams for semesters 1–5 who kept me engaged with PBL and the need to introduce changes to the MCQ style used in assessment.

Footnotes

  1. The abbreviations used are: PBL, problem-based learning; MCQ, multiple choice question; DKA, diabetic ketoacidosis.

  2. These questions and model answers were prepared by the author and were used in the mid-semester examination of semester two, with first-year medical students in 2002, at the School of Medicine, University of Melbourne, Victoria, Australia.
