Developing and assessing a force and motion learning progression


  • Earlier versions of this paper were presented at the 2007 annual meeting of the American Educational Research Association (Chicago, IL) and at the 2007 biennial meeting of the European Science Education Research Association (Malmö, Sweden).

    The full set of force and motion items is available by contacting the first author.

    Any opinions, findings, conclusions, or recommendations expressed in this paper are those of the authors. They do not necessarily represent the official views, opinions, or policy of the National Science Foundation.


Learning progressions are ordered descriptions of students' understanding of a given concept. In this paper, we describe the iterative process of developing a force and motion learning progression and associated assessment items. We report on a pair of studies designed to explore the diagnosis of students' learning progression levels. First, we compare the use of ordered multiple-choice (OMC) and open-ended (OE) items for assessing students relative to the learning progression. OMC items appear to provide more precise diagnoses of students' learning progression levels and to be more valid, eliciting students' conceptions more similarly to cognitive interviews. Second, we explore evidence bearing on two challenges concerning reliability and validity of level diagnoses: the consistency with which students respond to items set in different contexts and the ways in which students interpret and use language in responding to items. As predicted, students do not respond consistently to similar problems set in different contexts. Although the language used in OMC items generally seems to reflect student thinking, misinterpretation of the language in items may lead to inaccurate diagnoses for a subset of students. Both issues are less problematic for classroom applications than for use of learning progressions in large-scale testing. © 2008 Wiley Periodicals, Inc. Sci Ed 93: 389–421, 2009