Keywords:

  • Enzymes;
  • substrates;
  • misconceptions;
  • assessment;
  • concept inventory

Abstract

Enzyme function is central to student understanding of multiple topics within the biochemistry curriculum. In particular, students must understand how enzymes and substrates interact with one another. This manuscript describes the development of a 15-item Enzyme–Substrate Interactions Concept Inventory (ESICI) that measures student understanding of enzyme–substrate interactions. The validity and reliability of ESICI data were established through multiple methods. Results from the administration of the ESICI to biochemistry students across the United States (N = 707) are discussed in terms of instrument quality. The manuscript concludes with suggestions for how to use the ESICI for both teaching and biochemistry education research.

Multiple studies have investigated the misconceptions of students in introductory courses such as general chemistry and biology [1–4], but little research has investigated students' understandings in upper-level courses such as biochemistry and molecular biology (BMB) [5–8]. BMB educators need an instrument that is both precise and accurate in order to measure students' understandings in a timely manner. Although concept inventories have been developed in BMB [9, 10], the Enzyme–Substrate Interactions Concept Inventory (ESICI) is the first concept inventory to move beyond prerequisite knowledge for biochemistry and to measure a fundamental concept taught in BMB courses, namely enzyme–substrate interactions. The ESICI is a 15-item paper-and-pencil multiple-choice test that can be used pre- and post-instruction to measure the effect of instruction on students' conceptual understanding.

BACKGROUND

Students first learn about enzyme–substrate interactions in high school biology [11] and then again throughout college biology and chemistry courses [12]. Students, therefore, bring multiple ideas regarding enzyme–substrate interactions to college biochemistry courses. Constructivist learning theory describes how students use prior knowledge as a lens to learn new information, adding to, reconstructing, or adapting their prior knowledge to build understanding [13]. Because students' prior knowledge of enzymes can influence their understanding of core biochemistry concepts such as enzyme kinetics, regulation of metabolic pathways, and transcription and translation [12], it is important to have an instrument that will enable instructors to assess their students' understandings of fundamental concepts related to enzyme–substrate interactions.

Multiple-choice assessments known as concept inventories, with distractors based on students' incorrect ideas, offer an efficient measure of student understanding [14]. The majority of concept inventories use instructor experience and literature reviews to design questions [15, 16] despite the National Research Council's call [15] for the use of interviews to determine students' conceptual understanding when designing educational assessments. Questions designed via this "top-down" procedure result in questions and distractors worded in the language of the instructor, creating a potential threat to validity through the possible omission of students' misconceptions and fragmented ideas. By contrast, the ESICI has been designed through a bottom-up methodology in which students' language and misconceptions were used to construct both questions and item distractors.

METHODOLOGY

Our goal in developing the ESICI was to design a concept inventory grounded in student understanding that would be able to measure the thinking of a large, diverse sample of biochemistry students. The instrument needed to produce both valid and reliable data, and in doing so, be able to distinguish among students with different levels of biochemistry knowledge.

With these goals in mind, the ESICI was developed based on the misconceptions detected during interviews with both undergraduate and graduate students (N = 25) from biochemistry courses at a large predominantly undergraduate institution in the midwestern United States. Students were asked to interpret a pair of representations depicting enzyme–substrate interactions. Two pairs of representations were used: (1) iconic images of the "lock-and-key" and the "induced-fit" models and (2) molecular representations of trypsin and a substrate binding. Analysis of the interviews revealed common misconceptions regarding enzyme–substrate interactions across five categories: enzyme and substrate characteristics, the role of shape and charge in selectivity, how the enzyme interacts with the substrate, competitive vs. noncompetitive inhibition, and conformational change. (Detailed discussions of these misconceptions will be presented in future manuscripts.) Analysis of students' understandings of the representations revealed significant contradictions and confusion regarding the nature of enzyme–substrate interactions [17].

Based on these categories of misconceptions, a pilot version of the ESICI was developed in order to quantify the prevalence of these misconceptions with a larger sample. Questions and distractors stemmed directly from student interviews. The questions were reviewed by an expert panel of ten research-active biochemistry professors to determine the content and face validity of the questions [18]. The pilot version of the ESICI was administered to students (N = 108) in two biochemistry courses, two weeks after their course exam on enzyme–substrate interactions and enzyme kinetics. Students (N = 10) from both courses were interviewed within a week and asked to "think aloud" as to why they chose a specific answer and ruled out other possible answers for each question [19]. This process provided an additional check of validity by investigating whether the students chose correct answers for correct reasons. Based on responses from the expert review, student interviews, and item analysis, questions were revised.

The final version of the ESICI consists of 15 items (many of which use molecular and abstract representations) across the five categories of misconceptions discussed above. The ESICI also includes one item from the Biology Concept Inventory [4] that measures enzyme–substrate interactions as an additional form of validity. The final 15-item ESICI was administered to a larger population of biochemistry students (N = 788) in 17 courses from 16 institutions across the United States in order to further test the reliability and validity of the instrument. The results of the 707 students who answered all 15 items are reported. The ESICI was administered to students enrolled in one of four courses: a one-semester survey course composed predominantly of students majoring in dietetics, exercise science, and pre-professional students with intentions of going to medical school; a one-semester survey course for chemistry majors; the second semester of a year-long course sequence for biochemistry majors; and a biochemistry seminar required of all biochemistry majors (who had previously taken a biochemistry course). These courses were selected because students in each were taught about enzyme–substrate interactions for approximately two weeks. Students required ∼20 min to complete the ESICI. The ESICI was administered by each course instructor at least two weeks after his/her course exam on enzyme–substrate interactions and enzyme kinetics.

Participants

Of the 707 students responding, 57.3% were females and 78% were Caucasian. The academic majors of the students included 32% biology, 20% pre-health, 20% BMB, 14% nutrition, 14% exercise science, 8% chemistry, and 6% other.

Data Analysis

Student responses were coded 0 for an incorrect response and 1 for a correct response. Descriptive statistics and reliability coefficients were calculated using SPSS® statistical software version 16.0. Item and test psychometrics (i.e. item difficulty, item discrimination, and Ferguson's δ) were determined. These psychometrics provided evidence regarding the quality of measurement both at the level of an individual question, and for the inventory as a whole.
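These classical-test-theory computations can be sketched in a few lines of code. The snippet below is a minimal illustration on a hypothetical 0/1 response matrix (not the ESICI data; the study itself used SPSS), computing item difficulty and the point-biserial item reliability:

```python
from statistics import mean, pstdev

# Hypothetical response matrix: rows = students, columns = items (1 = correct).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 1, 1],
]

def item_difficulty(responses, item):
    # Proportion of students answering the item correctly.
    return mean(row[item] for row in responses)

def point_biserial(responses, item):
    # Correlation between the 0/1 item score and the total test score:
    # r_pbi = (M_correct - M_incorrect) / SD_total * sqrt(p * q).
    totals = [sum(row) for row in responses]
    scores = [row[item] for row in responses]
    p = mean(scores)
    m_correct = mean(t for t, s in zip(totals, scores) if s == 1)
    m_incorrect = mean(t for t, s in zip(totals, scores) if s == 0)
    return (m_correct - m_incorrect) / pstdev(totals) * (p * (1 - p)) ** 0.5

print(item_difficulty(responses, 0))
print(point_biserial(responses, 0))
```

Item discrimination (D), the third psychometric named above, is conventionally the difference in difficulty between the top- and bottom-scoring groups of students; it is omitted here for brevity.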

RESULTS

Individual student scores ranged from 1 to 15, with a mean of 8.32 ± 2.50. Ferguson's δ (a measure of how broadly the total scores are distributed over the possible range of scores) for this sample was δ = 0.949, meaning that this sample was distributed over 94.9% of the possible range of total scores; an acceptable value is δ ≥ 0.90 [20]. The results were significantly different from a normal distribution [Kolmogorov–Smirnov statistic, D(707) = 0.087, p < 0.001], so nonparametric techniques were used for data analysis. Classical test theory showed that the majority of the 15 ESICI items functioned in the acceptable range of difficulty, discrimination, and item reliability (see Table I).

Table I. Psychometrics for the 15 ESICI items

Item   Item difficulty (ρ)   Item discrimination (D)   Item reliability (r_pbi)
  1    0.873a                0.178                     0.204
  2    0.476                 0.361                     0.288
  3    0.419                 0.204                     0.166
  4    0.276                 0.366                     0.335
  5    0.751                 0.508                     0.470
  6    0.788                 0.445                     0.440
  7    0.432                 0.319                     0.269
  8    0.368                 0.408                     0.346
  9    0.653                 0.424                     0.367
 10    0.449                 0.461                     0.369
 11    0.235b                0.251                     0.212
 12    0.695                 0.634                     0.553
 13    0.524                 0.586                     0.485
 14    0.741                 0.440                     0.423
 15    0.741                 0.550                     0.423

a Easy item (ρ ≥ 0.80). b Difficult item (ρ ≤ 0.25).
Note: The ideal ranges are D ≥ 0.3 for discrimination and r_pbi ≥ 0.2 for the point biserial.
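Ferguson's δ reported above is computed from the frequency distribution of total scores; the following is a minimal pure-Python sketch on hypothetical scores (not the ESICI data):

```python
from collections import Counter

def fergusons_delta(total_scores, n_items):
    # delta = (N^2 - sum(f_i^2)) / (N^2 - N^2 / (k + 1)),
    # where f_i is the frequency of each possible total score 0..k.
    n = len(total_scores)
    sum_f_sq = sum(f * f for f in Counter(total_scores).values())
    return (n * n - sum_f_sq) / (n * n - n * n / (n_items + 1))

# A perfectly uniform score distribution gives delta = 1;
# identical scores for every student give delta = 0.
print(fergusons_delta([1, 4, 5, 7, 8, 8, 9, 10, 12, 15], 15))
```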

In order to establish concurrent validity, the data were analyzed by student major to determine whether students with more instruction regarding the concepts of enzyme–substrate interactions would score higher on the ESICI than students at lower levels of biochemistry. BMB and chemistry majors recorded the highest median score (Md = 10.00) and nutrition and exercise science (NXS) majors recorded the lowest median score (Md = 6.00) (see Fig. 1). A Kruskal–Wallis test revealed a statistically significant difference in total score on the ESICI across the six categories of major (see Table II). To locate the significant differences between individual majors, a series of pairwise comparisons using the Mann–Whitney U test was conducted. Based on the 15 pairwise comparisons made, the Bonferroni-corrected alpha value for the comparison of majors was α = 0.003. The Mann–Whitney U tests (see Table II) revealed that NXS majors scored significantly lower than all other majors, with effect sizes ranging from small to large. Both pre-health and biology majors scored significantly lower than chemistry majors and BMB majors. Therefore, students who have learned more about enzyme–substrate interactions (e.g. BMB majors) appear to perform better on the ESICI, a finding that further validates the ESICI as a measure of student understanding of enzyme–substrate interactions. Also worth noting is that the ESICI did not produce a "ceiling effect": chemistry and BMB majors did not automatically earn the highest score possible, and their differences in knowledge were detected by the ESICI.

Figure 1. Students' total score on the ESICI by academic major.

Table II. Students' performance on the ESICI by academic major

Omnibus test      df    N     χ²       p-value
Kruskal–Wallis     5   707   97.687   0.000

Pairwise comparison     N     Mann–Whitney U   Z statistic   p-value   Effect size (r)
PreHealth–NXS          242     4,725           4.472         0.000*    0.287
Biology–NXS            324     6,929           5.524         0.000*    0.307
Other–NXS              141     1,153           4.109         0.000*    0.346
Chemistry–NXS          161     1,103           6.846         0.000*    0.540
BMB–NXS                239     2,476           8.539         0.000*    0.552
Biology–PreHealth      366    14,458           1.478         0.140
Other–PreHealth        183     2,469           1.494         0.135
Other–Biology          265     4,311           0.628         0.530
Chemistry–PreHealth    203     2,716           4.246         0.000*    0.298
BMB–PreHealth          281     5,972           5.766         0.000*    0.344
Chemistry–Biology      285     4,997           3.240         0.001*    0.192
BMB–Biology            363    10,924           4.812         0.000*    0.253
Chemistry–Other        102       997           1.749         0.080
BMB–Other              180     2,175           2.321         0.020
BMB–Chemistry          200     4,062           0.475         0.635

Note: NXS = nutrition/exercise science; BMB = biochemistry/molecular biology.
* p < 0.003 (Bonferroni-corrected α).
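The pairwise procedure behind Table II can be sketched as follows. This is a minimal pure-Python Mann–Whitney U using the normal approximation without tie correction (so p-values will differ slightly from SPSS output), run on hypothetical scores for three majors with a Bonferroni-corrected α:

```python
from itertools import combinations
from math import sqrt, erf

def midranks(values):
    # Ranks 1..n, with tied values sharing the average rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for idx in order[i:j + 1]:
            ranks[idx] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def mann_whitney(x, y):
    # U statistic and a two-sided p-value via the normal approximation.
    n1, n2 = len(x), len(y)
    ranks = midranks(list(x) + list(y))
    u1 = sum(ranks[:n1]) - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)          # u <= n1*n2/2, so z <= 0 below
    z = (u - n1 * n2 / 2) / sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    p = 2 * 0.5 * (1 + erf(z / sqrt(2)))
    return u, p

groups = {  # hypothetical total scores by major
    "NXS": [4, 5, 6, 6, 7, 8],
    "Biology": [6, 7, 8, 9, 9, 10],
    "BMB": [8, 9, 10, 11, 12, 12],
}
pairs = list(combinations(groups, 2))
alpha = 0.05 / len(pairs)              # Bonferroni correction
for a, b in pairs:
    u, p = mann_whitney(groups[a], groups[b])
    print(f"{a} vs {b}: U = {u:.1f}, p = {p:.3f}, significant: {p < alpha}")
```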

The internal consistency of the ESICI was α = 0.53, as measured by Cronbach's α, which indicates how closely the questions measure the same construct [21]. Adams and Wieman [16] have argued that the internal consistency coefficient, although typically reported in the assessment literature with a minimum accepted value of 0.7, may not be appropriate for a concept inventory, and that additional means of establishing reliability are needed. Our data support this argument that Cronbach α values less than 0.7 may result when measuring students' misconceptions, e.g. when using a concept inventory. Per Ausubel and Novak's theory of meaningful learning [22], a learner's knowledge often contains disconnected ideas, incorrectly linked concepts, and missing key information. Therefore, as an assessment that targets misconceptions and was developed from detailed interviews with students, the ESICI would not be expected to measure the highly connected knowledge structure necessary to achieve a Cronbach α value greater than 0.7. Rather, the ESICI explicitly measures the gaps and incorrect links in student knowledge about enzyme–substrate interactions. Consequently, a Cronbach α value of 0.53 seems appropriate as an indicator of internal consistency.
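Cronbach's α follows directly from its defining formula, α = k/(k−1) · (1 − Σσᵢ²/σ_total²); below is a minimal sketch on a hypothetical response matrix (not the ESICI data):

```python
from statistics import pvariance

def cronbach_alpha(responses):
    # alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores);
    # rows = students, columns = items.
    k = len(responses[0])
    item_vars = [pvariance([row[i] for row in responses]) for i in range(k)]
    total_var = pvariance([sum(row) for row in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

responses = [  # hypothetical 0/1 responses, rows = students
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(cronbach_alpha(responses))
```

When items do not co-vary strongly, as with distractors targeting different fragmented misconceptions, α stays well below 0.7 even for a well-functioning instrument.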

As an alternative to the Cronbach α, Adams and Wieman [16] suggested that the reliability of a concept inventory be determined by testing similar populations and calculating a stability coefficient. To do this, the ESICI was administered twice, one month apart, to a one-semester biochemistry survey course for dietetics, exercise science, and pre-professional students (N = 54) at a large, predominantly undergraduate institution in the midwestern United States. The student sample for the second administration of the ESICI performed similarly to the original total sample in terms of descriptive statistics (see Table III). A Wilcoxon signed-ranks test indicated no significant difference between the students' performance on the two administrations (z = −0.636, p = 0.525). The correlation between student scores across the two administrations was strong and positive (ρ = 0.559, n = 54, p < 0.001). The high correlation of item difficulty values between the test and retest administrations (ρ = 0.907, n = 15, p < 0.001) also suggested that the items performed consistently (i.e. difficult items remained difficult, easy items remained easy).

Table III. Comparison of the descriptive statistics for the ESICI between the full sample (N = 707) and retest samples (N = 54)

Sample           N     Mean   Median   Std. Dev.   Min.   Max.   Skewness   Kurtosis
Full             707   8.32   8.00     2.501        1     15     −0.019     −0.455
Retest Admin 1    54   7.52   8.00     2.612        3     14     −0.019     −0.455
Retest Admin 2    54   7.81   8.00     2.628        3     13      0.241     −0.589
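The stability coefficient for item difficulties is a Spearman rank correlation; the following is a minimal sketch on hypothetical difficulty values from two administrations (assuming no tied values):

```python
def spearman_rho(x, y):
    # rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), assuming no tied values.
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for pos, idx in enumerate(order):
            r[idx] = pos + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_sq = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_sq / (n * (n ** 2 - 1))

# Hypothetical item-difficulty values from a test and a retest:
test1 = [0.87, 0.48, 0.42, 0.28, 0.75]
retest = [0.83, 0.51, 0.40, 0.30, 0.72]
print(spearman_rho(test1, retest))  # identical rank order -> 1.0
```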

In summary, these item and test psychometrics together establish that the ESICI can be used to collect valid and reliable data when assessing student misconceptions related to enzyme–substrate interactions. As described above, ESICI items measure the categories of misconceptions detected during the student interviews. Table IV summarizes these five categories and the specific misconceptions measured by the ESICI. Future manuscripts elaborating upon student thinking with regard to each misconception are planned. Further analysis indicated that 85% of students held at least one misconception in three or more of the five categories. Only 3 of 707 students correctly answered all 15 items on the ESICI, indicating that they did not hold any of the misconceptions measured by the ESICI.

Table IV. Misconceptions detected by the ESICI

Misconception (item distractor)                                               % students
Role of shape and charge in selectivity
 Charged amino acid interacts with OH regardless of sterics (7a, 7d, 3d)      50.9, 41.5
 Students consider charge but not shape (9c)                                  26.6
 Similar amino acid will bind in pocket (3b)                                  12.2
How the enzyme interacts with the substrate
 Disregard of relationship between scissile bond and specificity
  interaction (11c)                                                           44.6
 Enzyme will bind most tightly to the substrate (4a)                          37.9
 Active site is the only place of interaction (11a, 12d)                      27.7, 13.3
 Allosteric site is not a binding site (2b, 2d)                               27.2
 Binding only occurs at transition state (4d)                                 20.8
 Active site is on the substrate (12a)                                        15.7
 Specificity pocket is not a binding site (2a, 2d)                            14.4
 Enzyme will bind most tightly to the active site (4b)                        13.6
Competitive vs. noncompetitive inhibition
 Inhibitors can bind to the substrate (10d)                                   33.5
 Inhibitors interact only via competitive inhibition (10a, 6a)                18.8, 15.3
Conformational change
 Allosteric effector must change enzyme conformation (15c)                    28.7
 An enzyme must change conformation prior to interacting with substrate (14b) 12.3
Enzyme and substrate characteristics
 Solvent cannot be a substrate (8c)                                           27.6
 Enzyme is a protein, therefore a protein cannot be a substrate (8d)          18.8
 The “key” images represent the enzyme (5b)                                   17.7
 Nucleotides cannot be substrates (8b)                                        16.8

Note: Numbers in parentheses represent the item distractor(s) that measure the misconception; where multiple distractors are listed, the percentages correspond in order.

IMPLICATIONS FOR TEACHING AND FUTURE RESEARCH

As a diagnostic instrument, the ESICI has multiple potential uses within the classroom. Instructors could use the questions to formatively assess student understanding of enzyme–substrate interactions, for example by presenting items prior to instruction as a means of “just-in-time” teaching [23]. Items on the ESICI could also be given during lecture using a classroom response system [24] to gauge student understanding and to surface misconceptions that could be addressed directly during lecture. Researchers and instructors could use the ESICI to measure the effectiveness of curricular developments involving enzyme–substrate interactions by administering it pre- and post-instruction to determine how the instruction influenced student understanding of this specific topic. In addition, researchers could use the instrument as a base to delve deeper into student understanding of enzymes. Those interested in gaining access to the ESICI can contact the corresponding author for a copy of the instrument.

Acknowledgements

This work was supported by grant No. 0733642 from the National Science Foundation. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

REFERENCES

  • 1
    H. D. Barke, A. Hazari, S. Yitbarek (2009) Misconceptions in Chemistry: Addressing Perceptions in Chemical Education, Springer-Verlag, Berlin, Germany.
  • 2
    C.-Y. Tsui, D. Treagust (2009) Evaluating secondary students' scientific reasoning in genetics using a two-tier diagnostic instrument. Int. J. Sci. Educ. 32, 1073–1098.
  • 3
    D. L. Anderson, K. M. Fisher, G. J. Norman (2002) Development and evaluation of the conceptual inventory of natural selection. J. Res. Sci. Teach. 39, 952–978.
  • 4
    M. W. Klymkowsky, S. Underwood, K. Garvin-Doxas (2010) The Biological Concepts Instrument (BCI), a diagnostic tool to reveal student thinking. arXiv:1012.4501v1.
  • 5
    T. R. Anderson, L. G. Crossley, D. J. Grayson, in M. Komorek, H. Behrendt, H. Dahncke, R. Duit, W. Graber, A. Kross, Eds. (1999) Identifying students' conceptual and reasoning difficulties with biochemistry, 2nd International Conference of the European Science Education Research Association (ESERA). ESERA, Kiel, Germany, pp. 86–88.
  • 6
    T. L. Hull (2003) Students' Use of Diagrams for the Visualization of Biochemical Processes. Unpublished M.Sc. thesis, University of KwaZulu-Natal, South Africa.
  • 7
    K. J. Schönborn, T. R. Anderson, D. J. Grayson (2002) Student difficulties with the interpretation of a textbook diagram of immunoglobulin G (IgG). Biochem. Mol. Biol. Educ. 30, 93–97.
  • 8
    M. K. Orgill, A. Sutherland (2008) Undergraduate chemistry students' perceptions of and misconceptions about buffers and buffer problems. Chem. Educ. Res. Pract. 9, 131–143.
  • 9
    S. Howitt, T. Anderson, M. Costa, S. Hamilton, T. Wright (2008) A concept inventory for molecular life sciences: How will it help your teaching practice? Aust. Biochemist 39, 14–17.
  • 10
    S. M. Villafañe, C. P. Bailey, J. Loertscher, V. Minderhout, J. E. Lewis (2011) Development and analysis of an instrument to assess student understanding of foundational concepts before biochemistry coursework. Biochem. Mol. Biol. Educ. 39, 102–109.
  • 11
    National Academy of Sciences (1996) National Science Education Standards, National Academy of Sciences, Washington, DC.
  • 12
    J. G. Voet, E. Bell, R. Boyer, J. Boyle, M. O'Leary, J. K. Zimmerman (2003) Recommended curriculum for a program in biochemistry and molecular biology. Biochem. Mol. Biol. Educ. 31, 161–162.
  • 13
    G. M. Bodner, M. Klobuchar, D. Geelan (2001) The many forms of constructivism. J. Chem. Educ. 78, 1107.
  • 14
    AAAS (2011) Vision and Change in Undergraduate Biology Education: A Call to Action, AAAS, Washington, DC.
  • 15
    National Academy of Sciences (2001) Knowing What Students Know: The Science and Design of Educational Assessment, National Academy of Sciences, Washington, DC.
  • 16
    W. K. Adams, C. E. Wieman (2011) Development and validation of instruments to measure learning of expert-like thinking. Int. J. Sci. Educ. 33, 1289–1312.
  • 17
    K. J. Linenberger, S. L. Bretz (2012) Generating cognitive dissonance in student interviews through multiple representations. Chem. Educ. Res. Pract., DOI: 10.1039/C1RP90064A.
  • 18
    W. M. K. Trochim (2006) Social Research Methods Knowledge Base: www.socialresearchmethods.net (last accessed July 20, 2011).
  • 19
    C. W. Bowen (1994) Think-aloud methods in chemistry education: Understanding student thinking. J. Chem. Educ. 71, 184.
  • 20
    G. A. Ferguson (1949) On the theory of test discrimination. Psychometrika 14, 61–68.
  • 21
    L. J. Cronbach (1951) Coefficient alpha and the internal structure of tests. Psychometrika 16, 297–334.
  • 22
    D. P. Ausubel, J. D. Novak, H. Hanesian (1978) Educational Psychology: A Cognitive View, 2nd ed., Werbel & Peck, New York.
  • 23
    G. M. Novak, E. T. Patterson, A. D. Gavrin, W. Christian (1999) Just-in-Time Teaching: Blending Active Learning with Web Technology, Prentice Hall, Upper Saddle River, NJ.
  • 24
    M. R. Asirvatham (2010) Clickers in Action: Increasing Student Participation in General Chemistry, W.W. Norton & Company, New York, NY.