Earlier versions of this paper were presented at the 2007 annual meeting of the American Educational Research Association (Chicago, IL) and at the 2007 biennial meeting of the European Science Education Research Association (Malmö, Sweden).
Developing and assessing a force and motion learning progression†
Article first published online: 2 SEP 2008
Copyright © 2008 Wiley Periodicals, Inc.
Volume 93, Issue 3, pages 389–421, May 2009
How to Cite
Alonzo, A. C. and Steedle, J. T. (2009), Developing and assessing a force and motion learning progression. Sci. Ed., 93: 389–421. doi: 10.1002/sce.20303
The full set of force and motion items is available by contacting the first author.
Any opinions, findings, conclusions, or recommendations expressed in this paper are those of the authors. They do not necessarily represent the official views, opinions, or policy of the National Science Foundation.
- Issue published online: 3 APR 2009
- Manuscript Accepted: 6 JUN 2008
- Manuscript Revised: 31 MAY 2008
- Manuscript Received: 14 DEC 2007
Learning progressions are ordered descriptions of students' understanding of a given concept. In this paper, we describe the iterative process of developing a force and motion learning progression and associated assessment items. We report on a pair of studies designed to explore the diagnosis of students' learning progression levels. First, we compare the use of ordered multiple-choice (OMC) and open-ended (OE) items for assessing students relative to the learning progression. OMC items appear to provide more precise diagnoses of students' learning progression levels and to be more valid, eliciting students' conceptions in ways more consistent with cognitive interviews. Second, we explore evidence bearing on two challenges concerning the reliability and validity of level diagnoses: the consistency with which students respond to items set in different contexts and the ways in which students interpret and use language in responding to items. As predicted, students do not respond consistently to similar problems set in different contexts. Although the language used in OMC items generally seems to reflect student thinking, misinterpretation of the language in items may lead to inaccurate diagnoses for a subset of students. Both issues are less problematic for classroom applications than for the use of learning progressions in large-scale testing. © 2008 Wiley Periodicals, Inc. Sci Ed 93: 389–421, 2009