The teaching of the Molecular Life Sciences in most universities remains teacher-centered [1–3]. Instructors impart knowledge (terms, facts, concepts) in a didactic fashion and then complement it with “laboratories” or exercises to provide practice opportunities and develop skills. In such environments, students play predominantly passive roles [3, 4]. However, results from science education research show that instructors who get their students actively engaged can do more for student learning [5–9]. This suggests that, even though changing one's way of teaching is not easy, faculty members should move progressively toward interactive educational approaches in their classrooms.

Changing one's approach to teaching requires careful consideration of the different methods available. Currently there are many choices (see [10] for examples), and these keep expanding as newer methodologies are developed. Even a cursory Google search by the interested teacher using the term “based learning” yields over 7 million items! Indeed, many different formats have been named and described, ranging from problem-based learning, project-based learning, peer-led team learning, process-oriented guided inquiry learning, inquiry-based learning, case-based learning, team-based learning, student-centered learning, active learning, and cooperative learning to peer instruction, scientific teaching, and so forth. Each method has strong advocates of the virtues of its particular approach, despite the blurred boundaries between any two methods. Taking biochemistry and molecular biology as an example, the July/August 2008 issue of BAMBED advances two methodological suggestions beyond the widely accepted “Problem-Based Learning,” the long-standing alternative to traditional teaching [11].

Choosing a method has become a problem in itself. Indeed, the issues associated with implementing “pure” methods, however modern or well studied they may be, are often far from trivial. Knowing “which will work” is difficult, since little if any reliable empirical evidence is available for most of them. Providing an extensive list may in fact have the opposite effect on teachers' willingness to change, by creating more insecurity and thus inducing greater resistance to the idea of leaving the lectern.

In this discussion forum, we argue that, rather than focusing excessively on choosing ONE particular method, faculty members should be concerned with providing more opportunities for interactive teaching and should never ignore the specific context in which they teach. For operational reasons, we will assume that an effective educational method is one that: 1) motivates and inspires students; 2) achieves cognitive engagement from the students; and 3) enhances student learning.

The empirical evidence that supports interactive teaching approaches exists and is gradually expanding. However, finding the relevant literature requires transgressing disciplinary boundaries, because educational research on instruction at the college level is mostly performed within individual disciplines. Unfortunately, Molecular Life Science educators are generally unaware of the substantial body of literature in other disciplines, namely medical, engineering, and physics education.

The most compelling evidence on the power of interactivity in teaching comes from the Physics Education Research community [7]. The results come from the application of concept inventories (the Force Concept Inventory and the Mechanics Baseline Test [12, 13]) to assess improvements in student conceptual understanding under different instructional approaches. A meta-analysis of results gathered in a plethora of high school, college, and university classrooms, totaling 62 introductory physics courses and 6,542 students taught with a multitude of approaches [10], shows that “what works” is apparently “interactive engagement” approaches, described by the author of the study as “methods as those designed at least in part to promote conceptual understanding through interactive engagement of students in heads-on (always) and hands-on (usually) activities which yield immediate feedback through discussion with peers and/or instructors, all as judged by their literature descriptions.” Students taught through methods that qualified as interactive engagement consistently performed better on the same concept tests than students instructed in the traditional way. However, “consistently stronger” does not mean “universally stronger,” since the study identifies courses that fulfilled the criteria for interactive engagement yet showed learning gains at the lower end. The author notes that “various implementation problems are apparent” in such cases, including poor training of teachers, exam questions that do not probe conceptual understanding, or only sporadic use of interactive engagement methods.
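
The comparison in that survey rests on a simple metric, the average normalized gain, which expresses a class's improvement on a concept inventory as a fraction of the maximum improvement still available to it. A minimal rendering of the metric, as we understand it from [7] (the worked numbers are our own illustration, not data from the study):

\[
\langle g \rangle \;=\; \frac{\%\langle \mathrm{post} \rangle - \%\langle \mathrm{pre} \rangle}{100 - \%\langle \mathrm{pre} \rangle}
\]

For instance, a class averaging 45% on the pre-test and 70% on the post-test has \(\langle g \rangle = (70 - 45)/(100 - 45) \approx 0.45\); it is on this scale that the interactive engagement courses in the survey consistently outperformed the traditionally taught ones.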

Crouch and Mazur [14] provide longitudinal evidence on the effect of a highly interactive teaching approach designed for large auditoriums, Peer Instruction. The approach has been applied to large classes in lecture halls, with positive effects on learning in institutions as diverse as Harvard University and a 2-year college [15]. The lesson from Physics Education Research, performed by disciplinary experts, is that leaving the lectern is clearly beneficial for student learning, particularly for conceptually difficult topics, and that effectiveness is not the prerogative of any single method [16].

What immediate impacts can one expect from shifting to interactive teaching? A positive consequence, regardless of context, is greater student involvement in class. Answers to questionnaires (for example [17, 18]) and reported teacher observations (for example [17, 19]) consistently lead to the conclusion that interactivity induces more student commitment. However, it is more difficult to demonstrate changes in learning. Part of the difficulty stems from the fact that interactive teaching fosters more complex processing of information, and such gains may not be readily detected by simpler learning tasks. Michael [20] has summarized the evidence that active learning works, but to find this evidence one needs to stray beyond conventional domains. In this regard, biochemistry and molecular biology educators will need to document “what works” in our field: concept inventories under development in biology [21–23] will pave the way for disciplinary experts to obtain the necessary evidence on the effectiveness of any particular teaching method. The community will benefit greatly from that evidence.

In the meantime, we offer some tips to those who want to change but are unsure where or how to start. Given that most readers of this journal are likely to be experimental scientists, we frame our suggestions in a way that should be familiar to them. All experiments are ultimately interventions: a system is perturbed, and the outcomes are assessed and reflected upon.

Teachers who wish to change their teaching practice should look upon their situation in much the same way as they would approach an experimental one. They should consider the reasons why they want to change their teaching strategies and frame these as a problem to be solved. They should consider carefully the context of their practice and the resources available (time, support, facilities, etc.). They should consider the methods available in relation to their own problems, whether by consulting the literature, attending educational meetings or symposia, or discussing the matter with colleagues; again, these are strategies commonly used in any experimental situation. They should decide on measurable outcomes; such information can likewise be gleaned from the sources mentioned earlier. They should institute small, incremental changes rather than invoking major changes in one fell swoop. Thus a teacher wanting to promote student engagement in the lecture hall may find it easier to introduce short and easy methods such as the 1-min paper [24] or other “active learning” strategies [25]. This gives both the teacher and the students the initial confidence to make more drastic changes. Teachers who institute such changes should carefully document their observations, obtain feedback from their students or colleagues, reflect on their practice, and take appropriate action. It is essential to anticipate student resentment of approaches that aim to change how they learn, and to defuse it by explaining in advance what will be done and why. Sitting in on colleagues' classes, attending educational sessions at professional meetings (like those that take place annually at the ASBMB Meeting), or attending faculty development sessions like the ones organized by IUBMB is useful. Rigor should be applied to documenting the impact and effectiveness of one's teaching changes on students. A compendium of classroom assessment techniques is available [26], and a very useful website exists for developing personalized student surveys [27].
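
To make “measurable outcomes” concrete: one low-effort option is to give the same short concept quiz before and after a redesigned unit and convert the paired scores into the normalized gain described above. The sketch below is purely illustrative (the student identifiers and scores are invented, and Python is used only for convenience); it is not a prescription for how such data must be analyzed.

# Sketch: class-average normalized gain from paired pre/post quiz scores (percent correct).
# All identifiers and numbers are hypothetical.

def normalized_gain(pre: float, post: float) -> float:
    """Fraction of the possible improvement actually achieved."""
    if pre >= 100.0:
        return 0.0  # nothing left to gain
    return (post - pre) / (100.0 - pre)

# Hypothetical paired scores for a small class.
scores = [
    ("student_01", 40.0, 65.0),
    ("student_02", 55.0, 80.0),
    ("student_03", 30.0, 45.0),
]

gains = [normalized_gain(pre, post) for _, pre, post in scores]
print(f"Class-average normalized gain: {sum(gains) / len(gains):.2f}")

Paired with a short student survey [27] or a classroom assessment technique [26], even this simple number, tracked over successive offerings of a course, helps document whether a change is moving learning in the intended direction.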

Feedback and support from the educational community are surprisingly easy to obtain, since colleagues who have committed themselves to “change” are aware that solitary endeavors often fail. The rate-limiting step is normally contacting a peer, which can readily be done electronically. To get involved with a community that actively reflects on these issues, the Professional and Organizational Development network [28] is a good choice.

A word of caution on methods: standard approaches from experimental biology (like the use of a control group) are fraught with difficulties when transferred to educational settings [29]. For example, the placebo, a standard element of clinical trials, has no obvious counterpart in education. What, after all, would a placebo be in education? No teaching at all? If an innovative educational approach is being tested, the control could be standard teaching practice rather than no teaching, much as a standard drug serves as the comparator in a clinical trial. Even in such studies, however, the assumption that students do not interact outside the classroom is naïve. Patients on placebos may not exchange drugs, but students in the same classroom or university often share information. The true separation between treatments one envisages is therefore rarely maintained, and contamination may be the norm rather than the exception. Kember's criticisms are worth keeping in mind, and we suggest that readers pause and reflect on such issues so as not to fall into the control trap [29]. Approaches involving triangulation across multi-method evaluations from different sources are preferable [29, 30].

Every teaching/learning method has been developed in a specific learning environment, and approaches that work well in one situation may be far from ideal in another. Teachers should therefore not be looking for the one superior method. Instead, they should engage in self-reflection and increase interactivity according to their own needs and priorities, since selecting a specific approach requires careful consideration of context. Courses may be taught in very different classroom environments; for example, introductory biochemistry may enroll 400 motivated students or 20 unmotivated ones, and each calls for a different teaching design. Courses may also be offered in institutions with established policies of small-group teaching or in institutions that have been lecture-based for 100 years. How a course is set up is also tied to the availability of resources such as technology or teaching assistants. Thus, without careful thought about context, even the best approaches may fail: the learning environment may defeat the best-laid plans. Rather than starting with the question “what is the method that works?”, teachers should be asking “what are the goals of my course and what is the context in which it is being offered?” The final message is “stop cloning”: context overrides all.

Acknowledgements

This Commentary was inspired by the deliberations at the recent event held under the auspices of IUBMB: “The power of Interactive teaching: a hands-on workshop” at the School of Health Sciences, University of Minho (Portugal). We thank the IUBMB for sponsoring the event and the workshop facilitators, Drs. Nathaniel Lasry (John Abbott College, Canada), Janine Henderson, and John Lewis (Hull York Medical School, UK), for the stimulating discussions that did so much to nurture the ideas in this commentary.

REFERENCES

  1. E. Bell (2001) The future of education in the molecular life sciences. Nat. Rev. Mol. Cell Biol. 2, 221–225.
  2. E. J. Wood (2001) Biochemistry and molecular biology teaching over the past 50 years. Nat. Rev. Mol. Cell Biol. 2, 217–221.
  3. Board on Life Sciences (2003) BIO2010: Transforming Undergraduate Education for Future Research Biologists, National Academy Press, Washington, DC.
  4. R. Boyer (2003) Concepts and skills in the biochemistry/molecular biology lab. Biochem. Mol. Biol. Educ. 31, 102–105.
  5. J. D. Bransford, A. L. Brown, R. R. Cocking, Eds. (2000) How People Learn: Brain, Mind, Experience and School, National Academy Press, Washington, DC.
  6. J. Handelsman, D. Ebert-May, R. Beichner, P. Bruns, A. Chang, R. DeHaan, J. Gentile, S. Lauffer, J. Stewart, S. M. Tilghman, W. B. Wood (2004) Scientific teaching. Science 304, 521–522.
  7. R. R. Hake (1998) Interactive-engagement vs traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. Am. J. Phys. 66, 64–74.
  8. M. Prince (2004) Does active learning work? A review of the research. J. Eng. Educ. 93, 223–231.
  9. G. R. Norman, H. G. Schmidt (2000) Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Med. Educ. 34, 721–728.
  10. Interactive-engagement methods in introductory mechanics courses. Available at: http://www.physics.indiana.edu/%7Esdi/IEM-2b.pdf (accessed on January 2, 2009).
  11. T. Eberlein, J. Kampmeier, V. Minderhout, R. S. Moog, T. Platt, P. Varma-Nelson, H. B. White (2008) Pedagogies of engagement in science. Biochem. Mol. Biol. Educ. 36, 262–273.
  12. D. Hestenes, M. Wells, G. Swackhamer (1992) Force concept inventory. Phys. Teacher 30, 141–158.
  13. I. Halloun, D. Hestenes (1985) The initial knowledge state of college physics students. Am. J. Phys. 53, 1043–1055.
  14. C. Crouch, E. Mazur (2001) Peer instruction: Ten years of experience and results. Am. J. Phys. 69, 970–977.
  15. N. Lasry, E. Mazur, J. Watkins (2008) Peer instruction: From Harvard to community colleges. Am. J. Phys. 76, 1066–1069.
  16. R. R. Hake (2007) Six lessons from the physics education reform effort. Latin Am. J. Phys. Educ. 1, 24–31.
  17. P. K. Rangachari, U. Rangachari (2007) Information literacy in an inquiry course for first-year science undergraduates: A simplified 3C approach. Adv. Physiol. Educ. 31, 176–179.
  18. J. C. Sousa, M. J. Costa, J. A. Palha (2007) Hormone-mediated gene regulation and bioinformatics: Learning one from the other. PLoS ONE 2, e481.
  19. P. K. Rangachari (2006) Promoting self-directed learning using a menu of assessment options: The investment model. Adv. Physiol. Educ. 30, 181–194.
  20. J. Michael (2006) Where's the evidence that active learning works? Adv. Physiol. Educ. 30, 159–167.
  21. K. Garvin-Doxas, M. Klymkowsky, S. Elrod (2007) Building, using, and maximizing the impact of concept inventories in the biological sciences: Report on a National Science Foundation-sponsored conference on the construction of concept inventories in the biological sciences. CBE Life Sci. Educ. 6, 277–282.
  22. J. Michael, J. McFarland, A. Wright (2008) The second conceptual assessment in the biological sciences workshop. Adv. Physiol. Educ. 32, 248–251.
  23. S. Howitt, T. Anderson, M. J. Costa, S. Hamilton, A. Wright (2008) A concept inventory for molecular life sciences: How will it help your teaching practice? Aust. Biochem. 39, 14–17.
  24. D. Stead (2005) A review of the one-minute paper. Active Learn. Higher Educ. 6, 118–131.
  25. R. M. Felder (1997) Beating the numbers game: Effective teaching in large classes, Proceedings of the American Society of Engineering Education Annual Conference, Milwaukee, USA. Available at: http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/Large classes.htm (accessed on January 2, 2009).
  26. T. A. Angelo, K. P. Cross (1993) Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed., Jossey-Bass, San Francisco.
  27. Office of Institutional Research & Assessment Item Bank, Syracuse University. Available at: http://oira.syr.edu/Assessment/StudentRate/CreateForm.htm (accessed on January 2, 2009).
  28. Professional and Organizational Development Network. Available at: http://www.podnetwork.org/ (accessed on January 2, 2009).
  29. D. Kember (2003) To control or not to control: The question of whether experimental designs are appropriate for evaluating teaching innovations in higher education. Assess. Eval. Higher Educ. 28, 89–101.
  30. R. Hake, in A. E. Kelly, R. A. Lesh, J. Y. Baek, Eds. (2008) Handbook of Design Research Methods in Mathematics, Science, and Technology Education, Erlbaum, Mahwah, NJ, pp. 493–508.